Diagnosed failure

SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers: unrecognized error type; see the full log below for details.

Full log

[==========] Running 57 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 31 tests from SecurityITest
[ RUN      ] SecurityITest.TestAuthorizationOnListTablets
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:18 dist-test-slave-2x32 krb5kdc[26626](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:18 dist-test-slave-2x32 krb5kdc[26626](info): set up 2 sockets
May 04 14:07:18 dist-test-slave-2x32 krb5kdc[26626](info): commencing operation
krb5kdc: starting...
WARNING: Logging before InitGoogleLogging() is written to STDERR
W20260504 14:07:20.292529 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.013s	user 0.002s	sys 0.004s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:20 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903640, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:20Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:20Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:20.467285 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46053
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46087
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:46053
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:20.577952 26642 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:20.578255 26642 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:20.578368 26642 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:20.581974 26642 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:20.582065 26642 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:20.582091 26642 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:20.582111 26642 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:20.582167 26642 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:20.587534 26642 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46087
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:46053
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46053
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.26642
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:20.588766 26642 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:20.589692 26642 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260504 14:07:20.596236 26642 server_base.cc:1061] running on GCE node
W20260504 14:07:20.596168 26650 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:20.596385 26647 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:20.596176 26648 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:20.597057 26642 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:20.597965 26642 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:20.599149 26642 hybrid_clock.cc:648] HybridClock initialized: now 1777903640599141 us; error 47 us; skew 500 ppm
May 04 14:07:20 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903640, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:20.602146 26642 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:20.603525 26642 webserver.cc:492] Webserver started at http://127.25.254.254:40467/ using document root <none> and password file <none>
I20260504 14:07:20.604105 26642 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:20.604184 26642 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:20.604403 26642 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:20.606199 26642 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "2b726dd101f7b0031af7d31a7dc77ca1"
server_key_iv: "f067c1eb06d2fb6bc3537e9be28b8c50"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.606747 26642 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "2b726dd101f7b0031af7d31a7dc77ca1"
server_key_iv: "f067c1eb06d2fb6bc3537e9be28b8c50"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.610283 26642 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:20.612752 26657 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:20.613929 26642 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:20.614108 26642 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "2b726dd101f7b0031af7d31a7dc77ca1"
server_key_iv: "f067c1eb06d2fb6bc3537e9be28b8c50"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.614339 26642 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:20.648518 26642 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:20.651849 26642 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:20.652079 26642 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:20.660032 26642 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:46053
I20260504 14:07:20.660027 26709 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:46053 every 8 connection(s)
I20260504 14:07:20.661104 26642 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:20.663968 26710 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:20.664373 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 26642
I20260504 14:07:20.664547 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:20.665553 26619 external_mini_cluster.cc:1468] Setting key 015847fb2bdd9a2930ddf93057ed568b
I20260504 14:07:20.669900 26710 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff: Bootstrap starting.
I20260504 14:07:20.672207 26710 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:20.672952 26710 log.cc:826] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:20.674810 26710 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff: No bootstrap required, opened a new log
I20260504 14:07:20.677635 26710 raft_consensus.cc:359] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } }
May 04 14:07:20 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903640, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:20.677865 26710 raft_consensus.cc:385] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:20.677923 26710 raft_consensus.cc:740] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0449fb14e5a940c4b9e7e66ad1bb9aff, State: Initialized, Role: FOLLOWER
I20260504 14:07:20.678406 26710 consensus_queue.cc:260] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } }
I20260504 14:07:20.678558 26710 raft_consensus.cc:399] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:20.678633 26710 raft_consensus.cc:493] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:20.678755 26710 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:20.679935 26710 raft_consensus.cc:515] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } }
I20260504 14:07:20.680356 26710 leader_election.cc:304] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 0449fb14e5a940c4b9e7e66ad1bb9aff; no voters: 
I20260504 14:07:20.680751 26710 leader_election.cc:290] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:20.680872 26715 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:20.681114 26715 raft_consensus.cc:697] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [term 1 LEADER]: Becoming Leader. State: Replica: 0449fb14e5a940c4b9e7e66ad1bb9aff, State: Running, Role: LEADER
I20260504 14:07:20.681478 26715 consensus_queue.cc:237] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } }
I20260504 14:07:20.681977 26710 sys_catalog.cc:565] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:20.682710 26716 sys_catalog.cc:455] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [sys.catalog]: SysCatalogTable state changed. Reason: New leader 0449fb14e5a940c4b9e7e66ad1bb9aff. Latest consensus state: current_term: 1 leader_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } } }
I20260504 14:07:20.682832 26716 sys_catalog.cc:458] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:20.683344 26722 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:20.684855 26717 sys_catalog.cc:455] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0449fb14e5a940c4b9e7e66ad1bb9aff" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46053 } } }
I20260504 14:07:20.684944 26717 sys_catalog.cc:458] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:20.685297 26712 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:20.667843 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48614 (local address 127.25.254.254:46053)
0504 14:07:20.668267 (+   424us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:20.668280 (+    13us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:20.668797 (+   517us) server_negotiation.cc:408] Connection header received
0504 14:07:20.671179 (+  2382us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:20.671193 (+    14us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:20.671464 (+   271us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:20.671789 (+   325us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:20.673542 (+  1753us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.674343 (+   801us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:20.675035 (+   692us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.675313 (+   278us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:20.678877 (+  3564us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:20.678901 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:20.678941 (+    40us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:20.678978 (+    37us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:20.681655 (+  2677us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:20.682283 (+   628us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:20.682289 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:20.682295 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:20.682392 (+    97us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:20.682764 (+   372us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:20.682767 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:20.682769 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:20.683204 (+   435us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:20.683415 (+   211us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:20.684143 (+   728us) server_negotiation.cc:300] Negotiation successful
0504 14:07:20.684394 (+   251us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":280,"thread_start_us":148,"threads_started":1}
I20260504 14:07:20.687018 26722 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:20.692502 26722 catalog_manager.cc:1357] Generated new cluster ID: fecb90bc98ee4e328def6fdd138e22c5
I20260504 14:07:20.692590 26722 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:20.700470 26722 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:20.701413 26722 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:20.710474 26722 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 0449fb14e5a940c4b9e7e66ad1bb9aff: Generated new TSK 0
I20260504 14:07:20.711107 26722 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:20.770696 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46053
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46087
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:20.880152 26738 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:20.880406 26738 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:20.880504 26738 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:20.884370 26738 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:20.884452 26738 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:20.884600 26738 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:20.889396 26738 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46087
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46053
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.26738
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:20.890689 26738 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:20.891605 26738 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:20.898721 26746 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:20.898772 26743 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:20.898792 26744 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:20.899202 26738 server_base.cc:1061] running on GCE node
I20260504 14:07:20.899668 26738 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:20.900246 26738 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:20.901448 26738 hybrid_clock.cc:648] HybridClock initialized: now 1777903640901435 us; error 26 us; skew 500 ppm
May 04 14:07:20 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903640, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:20.904469 26738 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:20.905601 26738 webserver.cc:492] Webserver started at http://127.25.254.193:42113/ using document root <none> and password file <none>
I20260504 14:07:20.906255 26738 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:20.906334 26738 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:20.906554 26738 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:20.908380 26738 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "cf3c1ff942fd4621a8a0d84d87b5cea6"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "d549598f53e7a39230c5320d030f3d44"
server_key_iv: "b29d742cd8ce4f8ee5d342142c7cb868"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.908917 26738 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "cf3c1ff942fd4621a8a0d84d87b5cea6"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "d549598f53e7a39230c5320d030f3d44"
server_key_iv: "b29d742cd8ce4f8ee5d342142c7cb868"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.912402 26738 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.004s
I20260504 14:07:20.914673 26753 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:20.915655 26738 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:20.915792 26738 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "cf3c1ff942fd4621a8a0d84d87b5cea6"
format_stamp: "Formatted at 2026-05-04 14:07:20 on dist-test-slave-2x32"
server_key: "d549598f53e7a39230c5320d030f3d44"
server_key_iv: "b29d742cd8ce4f8ee5d342142c7cb868"
server_key_version: "encryptionkey@0"
I20260504 14:07:20.915900 26738 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:20.931911 26738 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:20.934912 26738 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:20.935128 26738 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:20.935755 26738 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:20.936685 26738 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:20.936771 26738 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:20.936837 26738 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:20.936887 26738 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:20.947849 26738 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:40233
I20260504 14:07:20.947862 26866 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:40233 every 8 connection(s)
I20260504 14:07:20.948885 26738 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:20.957158 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 26738
I20260504 14:07:20.957343 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:20.957628 26619 external_mini_cluster.cc:1468] Setting key ff6373a579cd89b81aef18272925176e
May 04 14:07:20 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903640, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:20.963142 26712 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:20.951051 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:57513 (local address 127.25.254.254:46053)
0504 14:07:20.951242 (+   191us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:20.951247 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:20.951979 (+   732us) server_negotiation.cc:408] Connection header received
0504 14:07:20.952843 (+   864us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:20.952846 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:20.952897 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:20.952978 (+    81us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:20.954698 (+  1720us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.955206 (+   508us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:20.956057 (+   851us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.956312 (+   255us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:20.959216 (+  2904us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:20.959246 (+    30us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:20.959256 (+    10us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:20.959283 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:20.960982 (+  1699us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:20.961571 (+   589us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:20.961576 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:20.961578 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:20.961634 (+    56us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:20.961965 (+   331us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:20.961969 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:20.961971 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:20.962137 (+   166us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:20.962310 (+   173us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:20.962833 (+   523us) server_negotiation.cc:300] Negotiation successful
0504 14:07:20.962976 (+   143us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":79}
I20260504 14:07:20.963833 26869 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:20.951381 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46053 (local address 127.25.254.193:57513)
0504 14:07:20.951832 (+   451us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:20.951866 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:20.952648 (+   782us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:20.953133 (+   485us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:20.953145 (+    12us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:20.953582 (+   437us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:20.954545 (+   963us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:20.954559 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.955495 (+   936us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:20.955498 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:20.955906 (+   408us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:20.955912 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:20.956126 (+   214us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:20.956950 (+   824us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:20.956979 (+    29us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:20.959050 (+  2071us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:20.961142 (+  2092us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:20.961149 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:20.961163 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:20.961451 (+   288us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:20.961745 (+   294us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:20.961748 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:20.961750 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:20.961862 (+   112us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:20.962310 (+   448us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:20.962317 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:20.962564 (+   247us) client_negotiation.cc:770] Sending connection context
0504 14:07:20.962787 (+   223us) client_negotiation.cc:241] Negotiation successful
0504 14:07:20.963039 (+   252us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":275,"thread_start_us":113,"threads_started":1}
I20260504 14:07:20.965085 26867 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46053
I20260504 14:07:20.965406 26867 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:20.965953 26867 heartbeater.cc:507] Master 127.25.254.254:46053 requested a full tablet report, sending...
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:20.967711 26674 ts_manager.cc:194] Registered new tserver with Master: cf3c1ff942fd4621a8a0d84d87b5cea6 (127.25.254.193:40233)
I20260504 14:07:20.968923 26674 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:57513
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:21.014940 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46053
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46087
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:21.120289 26874 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:21.120558 26874 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:21.120677 26874 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:21.124152 26874 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:21.124259 26874 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:21.124382 26874 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:21.128921 26874 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46087
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46053
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.26874
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:21.130105 26874 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:21.131022 26874 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:21.138139 26882 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:21.138319 26874 server_base.cc:1061] running on GCE node
W20260504 14:07:21.138146 26880 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:21.138319 26879 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:21.138873 26874 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:21.139469 26874 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:21.140674 26874 hybrid_clock.cc:648] HybridClock initialized: now 1777903641140646 us; error 40 us; skew 500 ppm
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:21.143867 26874 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:21.145006 26874 webserver.cc:492] Webserver started at http://127.25.254.194:44479/ using document root <none> and password file <none>
I20260504 14:07:21.145596 26874 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:21.145669 26874 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:21.145885 26874 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:21.147702 26874 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "2be4da5e30b446f7b80aaa6a376568ad"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "4b6735094a0a0854f78311ae2e6c890e"
server_key_iv: "4426f15e2d8b2f161c69ee84165476dd"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.148196 26874 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "2be4da5e30b446f7b80aaa6a376568ad"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "4b6735094a0a0854f78311ae2e6c890e"
server_key_iv: "4426f15e2d8b2f161c69ee84165476dd"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.151635 26874 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:21.154088 26889 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.155288 26874 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:21.155433 26874 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "2be4da5e30b446f7b80aaa6a376568ad"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "4b6735094a0a0854f78311ae2e6c890e"
server_key_iv: "4426f15e2d8b2f161c69ee84165476dd"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.155544 26874 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:21.171525 26874 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:21.174566 26874 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:21.174772 26874 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:21.175338 26874 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:21.176242 26874 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:21.176319 26874 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.176386 26874 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:21.176433 26874 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.186615 26874 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:42247
I20260504 14:07:21.186681 27002 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:42247 every 8 connection(s)
I20260504 14:07:21.187703 26874 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:21.192381 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 26874
I20260504 14:07:21.192494 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:21.192745 26619 external_mini_cluster.cc:1468] Setting key 614d1f236020227edda93b840446a324
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:21.201920 26712 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.189616 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:44601 (local address 127.25.254.254:46053)
0504 14:07:21.189777 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:21.189781 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:21.190518 (+   737us) server_negotiation.cc:408] Connection header received
0504 14:07:21.191424 (+   906us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:21.191428 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:21.191488 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:21.191599 (+   111us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:21.193590 (+  1991us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.194325 (+   735us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.195230 (+   905us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.195418 (+   188us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.197900 (+  2482us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:21.197925 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:21.197931 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:21.197967 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:21.199697 (+  1730us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.200429 (+   732us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.200432 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.200434 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.200475 (+    41us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.200833 (+   358us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.200836 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.200837 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.200977 (+   140us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:21.201056 (+    79us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:21.201691 (+   635us) server_negotiation.cc:300] Negotiation successful
0504 14:07:21.201796 (+   105us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":56}
I20260504 14:07:21.202864 27005 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.189894 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46053 (local address 127.25.254.194:44601)
0504 14:07:21.190380 (+   486us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:21.190413 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:21.191218 (+   805us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:21.191758 (+   540us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:21.191767 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:21.192179 (+   412us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:21.193350 (+  1171us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:21.193371 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.194460 (+  1089us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.194464 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:21.195078 (+   614us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:21.195091 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.195316 (+   225us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.195947 (+   631us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:21.195977 (+    30us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:21.197722 (+  1745us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:21.199841 (+  2119us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:21.199852 (+    11us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:21.199868 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:21.200250 (+   382us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:21.200575 (+   325us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:21.200580 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:21.200585 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:21.200733 (+   148us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:21.201088 (+   355us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:21.201095 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:21.201396 (+   301us) client_negotiation.cc:770] Sending connection context
0504 14:07:21.201648 (+   252us) client_negotiation.cc:241] Negotiation successful
0504 14:07:21.201898 (+   250us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":313,"thread_start_us":107,"threads_started":1}
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:07:21.204073 27003 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46053
I20260504 14:07:21.204350 27003 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:21.204957 27003 heartbeater.cc:507] Master 127.25.254.254:46053 requested a full tablet report, sending...
I20260504 14:07:21.206130 26674 ts_manager.cc:194] Registered new tserver with Master: 2be4da5e30b446f7b80aaa6a376568ad (127.25.254.194:42247)
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:21.206964 26674 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:44601
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:21.251579 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46053
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46087
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:21.360105 27010 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:21.360451 27010 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:21.360517 27010 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:21.364013 27010 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:21.364089 27010 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:21.364176 27010 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:21.368741 27010 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46087
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46053
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27010
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:21.369890 27010 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:21.370779 27010 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:21.377381 27016 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:21.377449 27015 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:21.377386 27018 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:21.378122 27010 server_base.cc:1061] running on GCE node
I20260504 14:07:21.378612 27010 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:21.379191 27010 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:21.380374 27010 hybrid_clock.cc:648] HybridClock initialized: now 1777903641380345 us; error 43 us; skew 500 ppm
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:21.383760 27010 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:21.385076 27010 webserver.cc:492] Webserver started at http://127.25.254.195:44221/ using document root <none> and password file <none>
I20260504 14:07:21.385658 27010 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:21.385708 27010 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:21.385916 27010 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:21.387826 27010 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "fd393d68dea84fa5942972de3f96a474"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "a6edf09d3ae8fd9486c9e704ecf65508"
server_key_iv: "5c7323bb0a4432b1d482f1cdb29dc872"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.388374 27010 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "fd393d68dea84fa5942972de3f96a474"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "a6edf09d3ae8fd9486c9e704ecf65508"
server_key_iv: "5c7323bb0a4432b1d482f1cdb29dc872"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.391986 27010 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:07:21.394469 27025 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.395613 27010 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:21.395768 27010 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "fd393d68dea84fa5942972de3f96a474"
format_stamp: "Formatted at 2026-05-04 14:07:21 on dist-test-slave-2x32"
server_key: "a6edf09d3ae8fd9486c9e704ecf65508"
server_key_iv: "5c7323bb0a4432b1d482f1cdb29dc872"
server_key_version: "encryptionkey@0"
I20260504 14:07:21.395891 27010 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:21.416249 27010 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:21.419749 27010 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:21.419972 27010 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:21.420658 27010 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:21.421641 27010 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:21.421717 27010 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.421790 27010 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:21.421847 27010 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:21.431946 27010 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:45067
I20260504 14:07:21.431977 27138 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:45067 every 8 connection(s)
I20260504 14:07:21.432945 27010 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:21.437981 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27010
I20260504 14:07:21.438115 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnListTablets.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:21.438442 26619 external_mini_cluster.cc:1468] Setting key 8cc7dab710c2d7beace3cd2ec6dc7f22
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:21.446511 26712 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.434887 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:35087 (local address 127.25.254.254:46053)
0504 14:07:21.435133 (+   246us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:21.435137 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:21.435667 (+   530us) server_negotiation.cc:408] Connection header received
0504 14:07:21.436657 (+   990us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:21.436661 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:21.436721 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:21.436826 (+   105us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:21.438523 (+  1697us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.439154 (+   631us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.439986 (+   832us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.440165 (+   179us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.442843 (+  2678us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:21.442860 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:21.442863 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:21.442893 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:21.444413 (+  1520us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.444971 (+   558us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.444975 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.444976 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.445030 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.445404 (+   374us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.445407 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.445408 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.445602 (+   194us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:21.445690 (+    88us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:21.446263 (+   573us) server_negotiation.cc:300] Negotiation successful
0504 14:07:21.446377 (+   114us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":152}
I20260504 14:07:21.447233 27141 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.435128 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46053 (local address 127.25.254.195:35087)
0504 14:07:21.435534 (+   406us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:21.435568 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:21.436413 (+   845us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:21.436987 (+   574us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:21.436996 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:21.437381 (+   385us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:21.438341 (+   960us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:21.438355 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.439297 (+   942us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.439300 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:21.439865 (+   565us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:21.439873 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.440100 (+   227us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.440836 (+   736us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:21.440862 (+    26us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:21.442663 (+  1801us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:21.444565 (+  1902us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:21.444571 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:21.444582 (+    11us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:21.444839 (+   257us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:21.445131 (+   292us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:21.445134 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:21.445135 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:21.445255 (+   120us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:21.445710 (+   455us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:21.445715 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:21.445933 (+   218us) client_negotiation.cc:770] Sending connection context
0504 14:07:21.446109 (+   176us) client_negotiation.cc:241] Negotiation successful
0504 14:07:21.446462 (+   353us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":245,"thread_start_us":106,"threads_started":1}
I20260504 14:07:21.448575 27139 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46053
I20260504 14:07:21.448817 27139 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:21.449321 27139 heartbeater.cc:507] Master 127.25.254.254:46053 requested a full tablet report, sending...
I20260504 14:07:21.450434 26674 ts_manager.cc:194] Registered new tserver with Master: fd393d68dea84fa5942972de3f96a474 (127.25.254.195:45067)
I20260504 14:07:21.451035 26674 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:35087
I20260504 14:07:21.453349 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-user@KRBTEST.COM: 
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:21.479665 27146 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.469620 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:36194 (local address 127.25.254.193:40233)
0504 14:07:21.469990 (+   370us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:21.469997 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:21.470023 (+    26us) server_negotiation.cc:408] Connection header received
0504 14:07:21.470085 (+    62us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:21.470090 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:21.470351 (+   261us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:21.470488 (+   137us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:21.471310 (+   822us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.472134 (+   824us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.472914 (+   780us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.473097 (+   183us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.475332 (+  2235us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:21.475361 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:21.475373 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:21.475401 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:21.477401 (+  2000us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.477935 (+   534us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.477941 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.477943 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.478000 (+    57us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.478521 (+   521us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.478525 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.478526 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.478798 (+   272us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:21.478981 (+   183us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:21.479239 (+   258us) server_negotiation.cc:300] Negotiation successful
0504 14:07:21.479419 (+   180us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":266,"thread_start_us":141,"threads_started":1}
W20260504 14:07:21.480670 26781 server_base.cc:1143] Unauthorized access attempt to method kudu.tserver.TabletServerService.ListTablets from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:36194
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[26626](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903641, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:21.506117 27146 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:21.497424 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:36198 (local address 127.25.254.193:40233)
0504 14:07:21.497628 (+   204us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:21.497632 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:21.497752 (+   120us) server_negotiation.cc:408] Connection header received
0504 14:07:21.497839 (+    87us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:21.497842 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:21.497903 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:21.497986 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:21.498936 (+   950us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.499442 (+   506us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:21.500192 (+   750us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:21.500478 (+   286us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:21.502839 (+  2361us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:21.502860 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:21.502862 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:21.502895 (+    33us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:21.504406 (+  1511us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.504881 (+   475us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.504884 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.504886 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.504939 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:21.505208 (+   269us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:21.505211 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:21.505212 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:21.505373 (+   161us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:21.505480 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:21.505763 (+   283us) server_negotiation.cc:300] Negotiation successful
0504 14:07:21.505872 (+   109us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":87}
I20260504 14:07:21.507454 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 26738
I20260504 14:07:21.513665 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 26874
I20260504 14:07:21.522261 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27010
I20260504 14:07:21.528450 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 26642
2026-05-04T14:07:21Z chronyd exiting
[       OK ] SecurityITest.TestAuthorizationOnListTablets (3278 ms)
[ RUN      ] SecurityITest.TestAuthorizationOnChecksum
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[27153](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[27153](info): set up 2 sockets
May 04 14:07:21 dist-test-slave-2x32 krb5kdc[27153](info): commencing operation
krb5kdc: starting...
W20260504 14:07:23.576654 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.025s	user 0.001s	sys 0.005s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:23 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903643, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:23Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:23Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:23.735772 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:43049
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34293
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:43049
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:23.850255 27169 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:23.850569 27169 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:23.850674 27169 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:23.854324 27169 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:23.854446 27169 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:23.854492 27169 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:23.854557 27169 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:23.854584 27169 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:23.859362 27169 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34293
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:43049
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:43049
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27169
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:23.860558 27169 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:23.861544 27169 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:23.868358 27174 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:23.868300 27175 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:23.868466 27169 server_base.cc:1061] running on GCE node
W20260504 14:07:23.868301 27177 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:23.869216 27169 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:23.870251 27169 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:23.871591 27169 hybrid_clock.cc:648] HybridClock initialized: now 1777903643871568 us; error 43 us; skew 500 ppm
May 04 14:07:23 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903643, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:23.874931 27169 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:23.876107 27169 webserver.cc:492] Webserver started at http://127.25.254.254:41425/ using document root <none> and password file <none>
I20260504 14:07:23.876727 27169 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:23.876809 27169 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:23.877028 27169 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:23.878857 27169 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "7bdfb39b971b4417abcdd7729414c95a"
format_stamp: "Formatted at 2026-05-04 14:07:23 on dist-test-slave-2x32"
server_key: "b3f2e80e03a6f472a35dce6bcb35d7cc"
server_key_iv: "9318390ee3b405d040d10386eed2bcbc"
server_key_version: "encryptionkey@0"
I20260504 14:07:23.879369 27169 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "7bdfb39b971b4417abcdd7729414c95a"
format_stamp: "Formatted at 2026-05-04 14:07:23 on dist-test-slave-2x32"
server_key: "b3f2e80e03a6f472a35dce6bcb35d7cc"
server_key_iv: "9318390ee3b405d040d10386eed2bcbc"
server_key_version: "encryptionkey@0"
I20260504 14:07:23.882995 27169 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.001s
I20260504 14:07:23.885314 27184 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:23.886423 27169 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:23.886559 27169 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "7bdfb39b971b4417abcdd7729414c95a"
format_stamp: "Formatted at 2026-05-04 14:07:23 on dist-test-slave-2x32"
server_key: "b3f2e80e03a6f472a35dce6bcb35d7cc"
server_key_iv: "9318390ee3b405d040d10386eed2bcbc"
server_key_version: "encryptionkey@0"
I20260504 14:07:23.886677 27169 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:23.932053 27169 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:23.940052 27169 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:23.940339 27169 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:23.948724 27169 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:43049
I20260504 14:07:23.948786 27236 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:43049 every 8 connection(s)
I20260504 14:07:23.949919 27169 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:23.952885 27237 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:23.953024 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27169
I20260504 14:07:23.953156 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:23.953388 26619 external_mini_cluster.cc:1468] Setting key 99d8c224298cde588977e441e11ffde6
I20260504 14:07:23.959743 27237 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a: Bootstrap starting.
May 04 14:07:23 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903643, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:23.962843 27237 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:23.963680 27237 log.cc:826] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:23.966135 27237 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a: No bootstrap required, opened a new log
I20260504 14:07:23.968544 27240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:23.954804 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:54164 (local address 127.25.254.254:43049)
0504 14:07:23.955251 (+   447us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:23.955263 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:23.955305 (+    42us) server_negotiation.cc:408] Connection header received
0504 14:07:23.956439 (+  1134us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:23.956472 (+    33us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:23.956816 (+   344us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:23.957101 (+   285us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:23.958346 (+  1245us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:23.959106 (+   760us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:23.960023 (+   917us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:23.960286 (+   263us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:23.963257 (+  2971us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:23.963277 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:23.963291 (+    14us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:23.963327 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:23.965666 (+  2339us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:23.966133 (+   467us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:23.966140 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:23.966147 (+     7us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:23.966362 (+   215us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:23.966623 (+   261us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:23.966626 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:23.966628 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:23.967062 (+   434us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:23.967249 (+   187us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:23.967518 (+   269us) server_negotiation.cc:300] Negotiation successful
0504 14:07:23.967768 (+   250us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":284,"thread_start_us":142,"threads_started":1}
I20260504 14:07:23.969543 27237 raft_consensus.cc:359] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } }
I20260504 14:07:23.969785 27237 raft_consensus.cc:385] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:23.969870 27237 raft_consensus.cc:740] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7bdfb39b971b4417abcdd7729414c95a, State: Initialized, Role: FOLLOWER
I20260504 14:07:23.970359 27237 consensus_queue.cc:260] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } }
I20260504 14:07:23.970484 27237 raft_consensus.cc:399] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:23.970577 27237 raft_consensus.cc:493] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:23.970690 27237 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:23.971645 27237 raft_consensus.cc:515] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } }
I20260504 14:07:23.971938 27237 leader_election.cc:304] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 7bdfb39b971b4417abcdd7729414c95a; no voters: 
I20260504 14:07:23.972261 27237 leader_election.cc:290] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:23.972430 27242 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:23.972668 27242 raft_consensus.cc:697] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [term 1 LEADER]: Becoming Leader. State: Replica: 7bdfb39b971b4417abcdd7729414c95a, State: Running, Role: LEADER
I20260504 14:07:23.972992 27242 consensus_queue.cc:237] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } }
I20260504 14:07:23.973445 27237 sys_catalog.cc:565] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:23.974052 27242 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [sys.catalog]: SysCatalogTable state changed. Reason: New leader 7bdfb39b971b4417abcdd7729414c95a. Latest consensus state: current_term: 1 leader_uuid: "7bdfb39b971b4417abcdd7729414c95a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } } }
I20260504 14:07:23.974293 27242 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:23.974491 27244 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "7bdfb39b971b4417abcdd7729414c95a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7bdfb39b971b4417abcdd7729414c95a" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43049 } } }
I20260504 14:07:23.974597 27244 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:23.974928 27247 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:23.978035 27247 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:23.983714 27247 catalog_manager.cc:1357] Generated new cluster ID: 5a941e7fc41544b688cdf8b9e37f795a
I20260504 14:07:23.983806 27247 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:23.989594 27247 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:23.990587 27247 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:24.001117 27247 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 7bdfb39b971b4417abcdd7729414c95a: Generated new TSK 0
I20260504 14:07:24.001797 27247 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:24.074990 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34293
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:24.183934 27265 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:24.184177 27265 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:24.184239 27265 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:24.187736 27265 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:24.187809 27265 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:24.187894 27265 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:24.192471 27265 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34293
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:43049
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27265
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:24.193718 27265 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:24.194613 27265 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:24.201476 27273 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:24.201483 27270 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:24.201490 27271 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:24.202054 27265 server_base.cc:1061] running on GCE node
I20260504 14:07:24.202607 27265 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:24.203231 27265 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:24.204401 27265 hybrid_clock.cc:648] HybridClock initialized: now 1777903644204359 us; error 64 us; skew 500 ppm
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:24.207355 27265 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:24.208479 27265 webserver.cc:492] Webserver started at http://127.25.254.193:35797/ using document root <none> and password file <none>
I20260504 14:07:24.209022 27265 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:24.209069 27265 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:24.209227 27265 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:24.211054 27265 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "03e0ce290a944dfd86e05ce3a2cae436"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "22ca6b5d278c9ca0013240dd49c0c850"
server_key_iv: "3732944c6f41307861c1427a79ef41bd"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.211506 27265 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "03e0ce290a944dfd86e05ce3a2cae436"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "22ca6b5d278c9ca0013240dd49c0c850"
server_key_iv: "3732944c6f41307861c1427a79ef41bd"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.215068 27265 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:24.217289 27280 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.218706 27265 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:24.218856 27265 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "03e0ce290a944dfd86e05ce3a2cae436"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "22ca6b5d278c9ca0013240dd49c0c850"
server_key_iv: "3732944c6f41307861c1427a79ef41bd"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.218974 27265 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:24.235874 27265 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:24.239233 27265 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:24.239478 27265 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:24.240099 27265 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:24.241101 27265 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:24.241179 27265 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.241250 27265 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:24.241304 27265 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.252650 27265 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:38655
I20260504 14:07:24.252663 27393 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:38655 every 8 connection(s)
I20260504 14:07:24.253677 27265 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:24.261417 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27265
I20260504 14:07:24.261610 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:24.261869 26619 external_mini_cluster.cc:1468] Setting key 08e041770da6b68a2b186af763eae27a
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:24.268229 27240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.255769 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:44601 (local address 127.25.254.254:43049)
0504 14:07:24.255953 (+   184us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.255957 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.256715 (+   758us) server_negotiation.cc:408] Connection header received
0504 14:07:24.257624 (+   909us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.257627 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.257678 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.257765 (+    87us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:24.259435 (+  1670us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.259959 (+   524us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.260690 (+   731us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.260854 (+   164us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.263552 (+  2698us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:24.263571 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:24.263574 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:24.263604 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:24.265425 (+  1821us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.266277 (+   852us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.266282 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.266284 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.266353 (+    69us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.266783 (+   430us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.266787 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.266788 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.267033 (+   245us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:24.267145 (+   112us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.267972 (+   827us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.268102 (+   130us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":76}
I20260504 14:07:24.269273 27396 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.256047 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43049 (local address 127.25.254.193:44601)
0504 14:07:24.256514 (+   467us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.256560 (+    46us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.257426 (+   866us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.257927 (+   501us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.257936 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.258361 (+   425us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:24.259262 (+   901us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.259277 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.260107 (+   830us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.260111 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.260530 (+   419us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.260537 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.260736 (+   199us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.261626 (+   890us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:24.261665 (+    39us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:24.263403 (+  1738us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:24.265605 (+  2202us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.265614 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.265631 (+    17us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.266027 (+   396us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.266476 (+   449us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.266483 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.266488 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.266669 (+   181us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.267173 (+   504us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:24.267183 (+    10us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:24.267634 (+   451us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.267917 (+   283us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.268209 (+   292us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":277,"thread_start_us":143,"threads_started":1}
I20260504 14:07:24.270613 27394 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43049
I20260504 14:07:24.270900 27394 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:24.271520 27394 heartbeater.cc:507] Master 127.25.254.254:43049 requested a full tablet report, sending...
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:24.273145 27201 ts_manager.cc:194] Registered new tserver with Master: 03e0ce290a944dfd86e05ce3a2cae436 (127.25.254.193:38655)
I20260504 14:07:24.274426 27201 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:44601
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:24.322515 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34293
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:24.431587 27401 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:24.431843 27401 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:24.431911 27401 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:24.435719 27401 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:24.435792 27401 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:24.435940 27401 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:24.440482 27401 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34293
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:43049
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27401
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:24.441689 27401 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:24.442765 27401 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:24.449775 27406 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:24.449788 27407 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:24.449996 27401 server_base.cc:1061] running on GCE node
W20260504 14:07:24.450131 27409 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:24.450757 27401 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:24.451548 27401 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:24.452754 27401 hybrid_clock.cc:648] HybridClock initialized: now 1777903644452722 us; error 45 us; skew 500 ppm
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:24.455848 27401 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:24.456972 27401 webserver.cc:492] Webserver started at http://127.25.254.194:39569/ using document root <none> and password file <none>
I20260504 14:07:24.457587 27401 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:24.457641 27401 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:24.457857 27401 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:24.459702 27401 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "491ae8d024c84d2ea5f6b97a34cc88fc"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "7a74439413a531f7c6f5244686fdd392"
server_key_iv: "d166966188c754f04d71645d3a685dd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.460196 27401 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "491ae8d024c84d2ea5f6b97a34cc88fc"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "7a74439413a531f7c6f5244686fdd392"
server_key_iv: "d166966188c754f04d71645d3a685dd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.463742 27401 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:07:24.466043 27416 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.467134 27401 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:24.467286 27401 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "491ae8d024c84d2ea5f6b97a34cc88fc"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "7a74439413a531f7c6f5244686fdd392"
server_key_iv: "d166966188c754f04d71645d3a685dd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.467428 27401 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:24.501420 27401 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:24.505185 27401 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:24.505437 27401 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:24.506067 27401 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:24.507071 27401 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:24.507120 27401 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.507192 27401 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:24.507226 27401 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.517288 27401 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:35713
I20260504 14:07:24.517297 27529 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:35713 every 8 connection(s)
I20260504 14:07:24.518405 27401 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:24.519649 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27401
I20260504 14:07:24.519738 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:24.520005 26619 external_mini_cluster.cc:1468] Setting key 505e69be398f1bddecdf0e6cacd7f9b8
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:24.533435 27240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.520471 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:60987 (local address 127.25.254.254:43049)
0504 14:07:24.520659 (+   188us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.520663 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.521510 (+   847us) server_negotiation.cc:408] Connection header received
0504 14:07:24.522911 (+  1401us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.522914 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.522964 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.523041 (+    77us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:24.524882 (+  1841us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.525748 (+   866us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.526760 (+  1012us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.527012 (+   252us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.529393 (+  2381us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:24.529420 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:24.529424 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:24.529455 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:24.531039 (+  1584us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.531607 (+   568us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.531611 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.531612 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.531666 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.532126 (+   460us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.532129 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.532130 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.532331 (+   201us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:24.532424 (+    93us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.533178 (+   754us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.533299 (+   121us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":70}
I20260504 14:07:24.534373 27532 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.520823 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43049 (local address 127.25.254.194:60987)
0504 14:07:24.521378 (+   555us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.521426 (+    48us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.522639 (+  1213us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.523260 (+   621us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.523275 (+    15us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.523825 (+   550us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:24.524705 (+   880us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.524719 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.525910 (+  1191us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.525913 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.526631 (+   718us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.526639 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.526811 (+   172us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.527469 (+   658us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:24.527497 (+    28us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:24.529224 (+  1727us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:24.531169 (+  1945us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.531174 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.531193 (+    19us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.531452 (+   259us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.531813 (+   361us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.531819 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.531831 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.532002 (+   171us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.532603 (+   601us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:24.532613 (+    10us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:24.532881 (+   268us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.533114 (+   233us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.533392 (+   278us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":330,"thread_start_us":153,"threads_started":1}
I20260504 14:07:24.535668 27530 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43049
I20260504 14:07:24.535944 27530 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:24.536468 27530 heartbeater.cc:507] Master 127.25.254.254:43049 requested a full tablet report, sending...
I20260504 14:07:24.537782 27201 ts_manager.cc:194] Registered new tserver with Master: 491ae8d024c84d2ea5f6b97a34cc88fc (127.25.254.194:35713)
I20260504 14:07:24.538538 27201 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:60987
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:24.579686 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34293
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_enforce_access_control with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:24.687186 27537 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:24.687461 27537 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:24.687575 27537 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:24.691602 27537 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:24.691711 27537 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:24.691835 27537 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:24.696568 27537 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34293
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:43049
--tserver_enforce_access_control=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27537
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:24.697769 27537 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:24.698721 27537 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:24.705688 27542 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:24.705693 27543 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:24.705721 27545 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:24.706074 27537 server_base.cc:1061] running on GCE node
I20260504 14:07:24.706537 27537 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:24.707163 27537 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:24.708374 27537 hybrid_clock.cc:648] HybridClock initialized: now 1777903644708349 us; error 59 us; skew 500 ppm
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:24.711311 27537 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:24.712448 27537 webserver.cc:492] Webserver started at http://127.25.254.195:36093/ using document root <none> and password file <none>
I20260504 14:07:24.713063 27537 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:24.713141 27537 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:24.713359 27537 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:24.715270 27537 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "219e503064c842a0b2aafba76c04d633"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "26c60b9b3594d62ce2413338d6a7d50f"
server_key_iv: "be7d7098ee4afda7e1149728739ec681"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.715788 27537 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "219e503064c842a0b2aafba76c04d633"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "26c60b9b3594d62ce2413338d6a7d50f"
server_key_iv: "be7d7098ee4afda7e1149728739ec681"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.719393 27537 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:24.721802 27552 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.723296 27537 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:24.723444 27537 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "219e503064c842a0b2aafba76c04d633"
format_stamp: "Formatted at 2026-05-04 14:07:24 on dist-test-slave-2x32"
server_key: "26c60b9b3594d62ce2413338d6a7d50f"
server_key_iv: "be7d7098ee4afda7e1149728739ec681"
server_key_version: "encryptionkey@0"
I20260504 14:07:24.723557 27537 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:24.736833 27537 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:24.740017 27537 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:24.740259 27537 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:24.740929 27537 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:24.741895 27537 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:24.741967 27537 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.742039 27537 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:24.742081 27537 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:24.751752 27537 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:41845
I20260504 14:07:24.751819 27665 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:41845 every 8 connection(s)
I20260504 14:07:24.752799 27537 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:24.756132 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27537
I20260504 14:07:24.756255 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizationOnChecksum.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:24.756534 26619 external_mini_cluster.cc:1468] Setting key 0cec21b11fbefc06c86b1912fc8dff25
May 04 14:07:24 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903644, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:24.765697 27240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.754752 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:39897 (local address 127.25.254.254:43049)
0504 14:07:24.754897 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.754901 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.755412 (+   511us) server_negotiation.cc:408] Connection header received
0504 14:07:24.756511 (+  1099us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.756514 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.756565 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.756728 (+   163us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:24.758586 (+  1858us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.759183 (+   597us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.759879 (+   696us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.760054 (+   175us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.762411 (+  2357us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:24.762429 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:24.762432 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:24.762458 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:24.763894 (+  1436us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.764384 (+   490us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.764388 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.764389 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.764437 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.764759 (+   322us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.764762 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.764763 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.764912 (+   149us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:24.764994 (+    82us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.765465 (+   471us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.765568 (+   103us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":52}
I20260504 14:07:24.766439 27668 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.754835 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43049 (local address 127.25.254.195:39897)
0504 14:07:24.755262 (+   427us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.755298 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.756276 (+   978us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.756922 (+   646us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.756935 (+    13us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.757345 (+   410us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:24.758434 (+  1089us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.758448 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.759308 (+   860us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.759312 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.759747 (+   435us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.759754 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.759969 (+   215us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.760605 (+   636us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:24.760625 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:24.762230 (+  1605us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:24.764011 (+  1781us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.764017 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.764027 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.764260 (+   233us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.764542 (+   282us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:24.764545 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:24.764547 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:24.764651 (+   104us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:24.765010 (+   359us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:24.765014 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:24.765229 (+   215us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.765418 (+   189us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.765644 (+   226us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":255,"thread_start_us":105,"threads_started":1}
I20260504 14:07:24.767620 27666 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43049
I20260504 14:07:24.767891 27666 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:24.768416 27666 heartbeater.cc:507] Master 127.25.254.254:43049 requested a full tablet report, sending...
I20260504 14:07:24.769701 27201 ts_manager.cc:194] Registered new tserver with Master: 219e503064c842a0b2aafba76c04d633 (127.25.254.195:41845)
I20260504 14:07:24.770375 27201 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:39897
I20260504 14:07:24.770823 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:07:24.780982 27240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.774001 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:54166 (local address 127.25.254.254:43049)
0504 14:07:24.774235 (+   234us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.774240 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.774409 (+   169us) server_negotiation.cc:408] Connection header received
0504 14:07:24.774508 (+    99us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.774513 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.774561 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.774654 (+    93us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:24.775525 (+   871us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.776007 (+   482us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.776685 (+   678us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.776892 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.777997 (+  1105us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:24.778017 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:24.778022 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:24.778053 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:24.779412 (+  1359us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.779825 (+   413us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.779829 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.779831 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.779882 (+    51us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:24.780122 (+   240us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:24.780125 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:24.780127 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:24.780303 (+   176us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:24.780458 (+   155us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.780700 (+   242us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.780811 (+   111us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":56}
I20260504 14:07:24.785809 27201 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:54166:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:24.788420 27201 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:24.802222 27677 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.797437 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:41845 (local address 127.0.0.1:56848)
0504 14:07:24.798062 (+   625us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.798101 (+    39us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.798266 (+   165us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.799016 (+   750us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.799023 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.799050 (+    27us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:24.799392 (+   342us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.799405 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.800787 (+  1382us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.800793 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.801669 (+   876us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.801677 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.801797 (+   120us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.801858 (+    61us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.801951 (+    93us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.802024 (+    73us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":460,"thread_start_us":82,"threads_started":1}
I20260504 14:07:24.802680 27680 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.798147 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:38655 (local address 127.0.0.1:37512)
0504 14:07:24.798604 (+   457us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.798616 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.798683 (+    67us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.799395 (+   712us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.799398 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.799412 (+    14us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:24.799916 (+   504us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.799924 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.801398 (+  1474us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.801402 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.802301 (+   899us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.802322 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.802437 (+   115us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.802455 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.802506 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.802556 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":352,"thread_start_us":106,"threads_started":1}
I20260504 14:07:24.802834 27678 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.797621 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:56848 (local address 127.25.254.195:41845)
0504 14:07:24.798413 (+   792us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.798422 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.798450 (+    28us) server_negotiation.cc:408] Connection header received
0504 14:07:24.798549 (+    99us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.798555 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.798775 (+   220us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.799684 (+   909us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:24.799880 (+   196us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.800656 (+   776us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.801837 (+  1181us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.802375 (+   538us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.802505 (+   130us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.802575 (+    70us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.802669 (+    94us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":595,"thread_start_us":156,"threads_started":1}
I20260504 14:07:24.803589 27679 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.797784 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35713 (local address 127.0.0.1:56440)
0504 14:07:24.798527 (+   743us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.798541 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.798620 (+    79us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.799344 (+   724us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.799362 (+    18us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.799387 (+    25us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:24.799648 (+   261us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.799665 (+    17us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.801472 (+  1807us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.801476 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.803251 (+  1775us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.803260 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.803362 (+   102us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.803375 (+    13us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.803420 (+    45us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.803471 (+    51us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":638,"thread_start_us":102,"threads_started":1}
I20260504 14:07:24.803624 27682 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.798736 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37512 (local address 127.25.254.193:38655)
0504 14:07:24.799038 (+   302us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.799043 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.799061 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:07:24.799129 (+    68us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.799133 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.799265 (+   132us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.799551 (+   286us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:24.800118 (+   567us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.801264 (+  1146us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.802863 (+  1599us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.803319 (+   456us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.803396 (+    77us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.803460 (+    64us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.803517 (+    57us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":208,"thread_start_us":117,"threads_started":1}
I20260504 14:07:24.804762 27681 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.798361 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:56440 (local address 127.25.254.194:35713)
0504 14:07:24.798883 (+   522us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.798890 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.798913 (+    23us) server_negotiation.cc:408] Connection header received
0504 14:07:24.798997 (+    84us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.799002 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.799181 (+   179us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.800104 (+   923us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:24.800254 (+   150us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.801378 (+  1124us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.803872 (+  2494us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.804381 (+   509us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.804470 (+    89us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.804535 (+    65us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.804624 (+    89us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":415,"thread_start_us":156,"threads_started":1}
I20260504 14:07:24.805958 27600 tablet_service.cc:1511] Processing CreateTablet for tablet 216fde87eab448c0aceb5924838926c4 (DEFAULT_TABLE table=test-table [id=685c001bb53443c6b77187180ff64e5c]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:24.806285 27328 tablet_service.cc:1511] Processing CreateTablet for tablet 216fde87eab448c0aceb5924838926c4 (DEFAULT_TABLE table=test-table [id=685c001bb53443c6b77187180ff64e5c]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:24.807046 27600 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 216fde87eab448c0aceb5924838926c4. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:24.807199 27328 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 216fde87eab448c0aceb5924838926c4. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:24.807452 27464 tablet_service.cc:1511] Processing CreateTablet for tablet 216fde87eab448c0aceb5924838926c4 (DEFAULT_TABLE table=test-table [id=685c001bb53443c6b77187180ff64e5c]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:24.808295 27464 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 216fde87eab448c0aceb5924838926c4. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:24.812436 27683 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Bootstrap starting.
I20260504 14:07:24.813680 27684 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Bootstrap starting.
I20260504 14:07:24.814709 27683 tablet_bootstrap.cc:654] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:24.815469 27683 log.cc:826] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:24.815672 27684 tablet_bootstrap.cc:654] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:24.816120 27685 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Bootstrap starting.
I20260504 14:07:24.816485 27684 log.cc:826] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:24.817432 27683 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: No bootstrap required, opened a new log
I20260504 14:07:24.817646 27683 ts_tablet_manager.cc:1403] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:24.818051 27685 tablet_bootstrap.cc:654] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:24.818442 27684 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: No bootstrap required, opened a new log
I20260504 14:07:24.818634 27684 ts_tablet_manager.cc:1403] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:24.818838 27685 log.cc:826] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:24.820446 27685 tablet_bootstrap.cc:492] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: No bootstrap required, opened a new log
I20260504 14:07:24.820667 27685 ts_tablet_manager.cc:1403] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:24.820958 27683 raft_consensus.cc:359] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.821334 27683 raft_consensus.cc:385] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:24.821442 27683 raft_consensus.cc:740] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 03e0ce290a944dfd86e05ce3a2cae436, State: Initialized, Role: FOLLOWER
I20260504 14:07:24.821979 27683 consensus_queue.cc:260] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.822371 27684 raft_consensus.cc:359] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.822597 27684 raft_consensus.cc:385] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:24.822665 27684 raft_consensus.cc:740] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 219e503064c842a0b2aafba76c04d633, State: Initialized, Role: FOLLOWER
I20260504 14:07:24.823060 27683 ts_tablet_manager.cc:1434] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Time spent starting tablet: real 0.005s	user 0.006s	sys 0.000s
I20260504 14:07:24.823093 27684 consensus_queue.cc:260] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.823460 27394 heartbeater.cc:499] Master 127.25.254.254:43049 was elected leader, sending a full tablet report...
I20260504 14:07:24.823940 27684 ts_tablet_manager.cc:1434] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Time spent starting tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:24.824282 27666 heartbeater.cc:499] Master 127.25.254.254:43049 was elected leader, sending a full tablet report...
I20260504 14:07:24.824484 27685 raft_consensus.cc:359] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.824735 27685 raft_consensus.cc:385] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:24.824822 27685 raft_consensus.cc:740] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 491ae8d024c84d2ea5f6b97a34cc88fc, State: Initialized, Role: FOLLOWER
I20260504 14:07:24.825202 27685 consensus_queue.cc:260] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.826112 27685 ts_tablet_manager.cc:1434] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Time spent starting tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:24.826320 27530 heartbeater.cc:499] Master 127.25.254.254:43049 was elected leader, sending a full tablet report...
I20260504 14:07:24.985772 27691 raft_consensus.cc:493] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:24.986009 27691 raft_consensus.cc:515] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.987354 27691 leader_election.cc:290] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 219e503064c842a0b2aafba76c04d633 (127.25.254.195:41845), 03e0ce290a944dfd86e05ce3a2cae436 (127.25.254.193:38655)
I20260504 14:07:24.990540 27532 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.987561 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:41845 (local address 127.25.254.194:48649)
0504 14:07:24.987719 (+   158us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.987737 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.987830 (+    93us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.988192 (+   362us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.988195 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.988217 (+    22us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:24.988515 (+   298us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.988527 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.989480 (+   953us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.989485 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.990109 (+   624us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.990117 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.990277 (+   160us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.990295 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.990348 (+    53us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.990407 (+    59us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":53}
I20260504 14:07:24.990779 27692 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.987822 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:38655 (local address 127.25.254.194:42847)
0504 14:07:24.988196 (+   374us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:24.988210 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:24.988299 (+    89us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:24.988630 (+   331us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:24.988633 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:24.988656 (+    23us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:24.988954 (+   298us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.988961 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.989685 (+   724us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.989691 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:24.990375 (+   684us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:24.990382 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.990506 (+   124us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.990522 (+    16us) client_negotiation.cc:770] Sending connection context
0504 14:07:24.990572 (+    50us) client_negotiation.cc:241] Negotiation successful
0504 14:07:24.990622 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":286,"thread_start_us":107,"threads_started":1}
I20260504 14:07:24.991034 27678 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.987710 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:48649 (local address 127.25.254.195:41845)
0504 14:07:24.987855 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.987860 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.987875 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:07:24.987926 (+    51us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.987931 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.987986 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.988070 (+    84us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:24.988700 (+   630us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.989361 (+   661us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.990260 (+   899us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.990768 (+   508us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.990815 (+    47us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.990863 (+    48us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.990911 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:07:24.991292 27682 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:24.988090 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:42847 (local address 127.25.254.193:38655)
0504 14:07:24.988255 (+   165us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:24.988259 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:24.988275 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:07:24.988430 (+   155us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:24.988433 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:24.988508 (+    75us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:24.988596 (+    88us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:24.989064 (+   468us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.989538 (+   474us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:24.990497 (+   959us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:24.990993 (+   496us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:24.991034 (+    41us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:24.991089 (+    55us) server_negotiation.cc:300] Negotiation successful
0504 14:07:24.991151 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":65}
I20260504 14:07:24.991670 27620 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "216fde87eab448c0aceb5924838926c4" candidate_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "219e503064c842a0b2aafba76c04d633" is_pre_election: true
I20260504 14:07:24.991937 27620 raft_consensus.cc:2468] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 491ae8d024c84d2ea5f6b97a34cc88fc in term 0.
I20260504 14:07:24.992096 27348 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "216fde87eab448c0aceb5924838926c4" candidate_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "03e0ce290a944dfd86e05ce3a2cae436" is_pre_election: true
I20260504 14:07:24.992413 27419 leader_election.cc:304] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 219e503064c842a0b2aafba76c04d633, 491ae8d024c84d2ea5f6b97a34cc88fc; no voters: 
I20260504 14:07:24.992520 27348 raft_consensus.cc:2468] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 491ae8d024c84d2ea5f6b97a34cc88fc in term 0.
I20260504 14:07:24.992657 27691 raft_consensus.cc:2804] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:24.992743 27691 raft_consensus.cc:493] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:24.992805 27691 raft_consensus.cc:3060] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:24.993953 27691 raft_consensus.cc:515] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:24.994375 27691 leader_election.cc:290] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [CANDIDATE]: Term 1 election: Requested vote from peers 219e503064c842a0b2aafba76c04d633 (127.25.254.195:41845), 03e0ce290a944dfd86e05ce3a2cae436 (127.25.254.193:38655)
I20260504 14:07:24.994901 27620 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "216fde87eab448c0aceb5924838926c4" candidate_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "219e503064c842a0b2aafba76c04d633"
I20260504 14:07:24.994978 27348 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "216fde87eab448c0aceb5924838926c4" candidate_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "03e0ce290a944dfd86e05ce3a2cae436"
I20260504 14:07:24.995031 27620 raft_consensus.cc:3060] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:24.995090 27348 raft_consensus.cc:3060] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:24.996233 27620 raft_consensus.cc:2468] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 491ae8d024c84d2ea5f6b97a34cc88fc in term 1.
I20260504 14:07:24.996233 27348 raft_consensus.cc:2468] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 491ae8d024c84d2ea5f6b97a34cc88fc in term 1.
I20260504 14:07:24.996543 27419 leader_election.cc:304] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 219e503064c842a0b2aafba76c04d633, 491ae8d024c84d2ea5f6b97a34cc88fc; no voters: 
I20260504 14:07:24.996724 27691 raft_consensus.cc:2804] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:24.996950 27691 raft_consensus.cc:697] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [term 1 LEADER]: Becoming Leader. State: Replica: 491ae8d024c84d2ea5f6b97a34cc88fc, State: Running, Role: LEADER
I20260504 14:07:24.997267 27691 consensus_queue.cc:237] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } }
I20260504 14:07:25.000725 27200 catalog_manager.cc:5671] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc reported cstate change: term changed from 0 to 1, leader changed from <none> to 491ae8d024c84d2ea5f6b97a34cc88fc (127.25.254.194). New cstate: current_term: 1 leader_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "219e503064c842a0b2aafba76c04d633" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 41845 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "491ae8d024c84d2ea5f6b97a34cc88fc" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35713 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "03e0ce290a944dfd86e05ce3a2cae436" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38655 } health_report { overall_health: UNKNOWN } } }
W20260504 14:07:25.003734 27667 tablet.cc:2404] T 216fde87eab448c0aceb5924838926c4 P 219e503064c842a0b2aafba76c04d633: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:07:25.005028 27395 tablet.cc:2404] T 216fde87eab448c0aceb5924838926c4 P 03e0ce290a944dfd86e05ce3a2cae436: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:07:25.019677 27531 tablet.cc:2404] T 216fde87eab448c0aceb5924838926c4 P 491ae8d024c84d2ea5f6b97a34cc88fc: Can't schedule compaction. Clean time has not been advanced past its initial value.
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903643, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:25.045127 27682 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:25.035905 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37514 (local address 127.25.254.193:38655)
0504 14:07:25.036090 (+   185us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:25.036095 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:25.036112 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:25.036374 (+   262us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:25.036381 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:25.036434 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:25.036516 (+    82us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:25.037396 (+   880us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.037892 (+   496us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:25.038687 (+   795us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.038835 (+   148us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:25.041562 (+  2727us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:25.041592 (+    30us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:25.041605 (+    13us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:25.041640 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:25.043446 (+  1806us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.043931 (+   485us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.043936 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.043938 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.043987 (+    49us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.044222 (+   235us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.044224 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.044226 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.044437 (+   211us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:25.044585 (+   148us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:25.044758 (+   173us) server_negotiation.cc:300] Negotiation successful
0504 14:07:25.044949 (+   191us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":85}
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903645, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-user@KRBTEST.COM: 
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903645, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:25.071643 27682 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:25.063196 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37520 (local address 127.25.254.193:38655)
0504 14:07:25.063406 (+   210us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:25.063410 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:25.063531 (+   121us) server_negotiation.cc:408] Connection header received
0504 14:07:25.063736 (+   205us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:25.063740 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:25.063794 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:25.063882 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:25.064721 (+   839us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.065198 (+   477us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:25.065868 (+   670us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.066102 (+   234us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:25.068507 (+  2405us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:25.068533 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:25.068536 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:25.068571 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:25.070038 (+  1467us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.070549 (+   511us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.070558 (+     9us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.070559 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.070608 (+    49us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.070818 (+   210us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.070821 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.070822 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.070952 (+   130us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:25.071048 (+    96us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:25.071374 (+   326us) server_negotiation.cc:300] Negotiation successful
0504 14:07:25.071502 (+   128us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":89}
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27153](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903645, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27153](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903645, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:25.097956 27682 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:25.089458 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37530 (local address 127.25.254.193:38655)
0504 14:07:25.089704 (+   246us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:25.089707 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:25.089724 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:25.089880 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:25.089883 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:25.089932 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:25.090008 (+    76us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:25.091202 (+  1194us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.091724 (+   522us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:25.092429 (+   705us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:25.092617 (+   188us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:25.095046 (+  2429us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:25.095069 (+    23us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:25.095072 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:25.095102 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:25.096416 (+  1314us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.096810 (+   394us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.096813 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.096814 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.096859 (+    45us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:25.097122 (+   263us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:25.097125 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:25.097127 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:25.097309 (+   182us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:25.097484 (+   175us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:25.097708 (+   224us) server_negotiation.cc:300] Negotiation successful
0504 14:07:25.097813 (+   105us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":147}
W20260504 14:07:25.099411 27308 tablet_service.cc:3018] Rejecting scan request for tablet 216fde87eab448c0aceb5924838926c4: Uninitialized: clean time has not yet been initialized
I20260504 14:07:25.101415 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27265
I20260504 14:07:25.108064 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27401
I20260504 14:07:25.114329 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27537
I20260504 14:07:25.120044 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27169
2026-05-04T14:07:25Z chronyd exiting
[       OK ] SecurityITest.TestAuthorizationOnChecksum (3592 ms)
[ RUN      ] SecurityITest.SmokeTestAsAuthorizedUser
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27707](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27707](info): set up 2 sockets
May 04 14:07:25 dist-test-slave-2x32 krb5kdc[27707](info): commencing operation
krb5kdc: starting...
W20260504 14:07:27.152505 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.008s	user 0.001s	sys 0.005s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:27Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:27Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:27.303846 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:43237
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38401
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:43237
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--txn_manager_enabled=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:27.411887 27723 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:27.412129 27723 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:27.412184 27723 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:27.415639 27723 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:27.415714 27723 flags.cc:432] Enabled experimental flag: --txn_manager_enabled=true
W20260504 14:07:27.415735 27723 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:27.415757 27723 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:27.415776 27723 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:27.415794 27723 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:27.420385 27723 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38401
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:43237
--txn_manager_enabled=true
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:43237
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27723
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:27.421519 27723 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:27.422501 27723 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:27.428272 27728 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:27.428290 27729 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:27.428272 27731 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:27.428601 27723 server_base.cc:1061] running on GCE node
I20260504 14:07:27.429112 27723 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:27.430032 27723 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:27.431241 27723 hybrid_clock.cc:648] HybridClock initialized: now 1777903647431201 us; error 57 us; skew 500 ppm
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:27.434093 27723 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:27.435235 27723 webserver.cc:492] Webserver started at http://127.25.254.254:37991/ using document root <none> and password file <none>
I20260504 14:07:27.435762 27723 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:27.435807 27723 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:27.435961 27723 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:27.437619 27723 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "c565044b9e4a4b8094dc715613b2df4b"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "d563459b6554f2b7874300b8f0c11a6b"
server_key_iv: "6d7e08c02eb4ef846fd41d27eaf54e15"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.438046 27723 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "c565044b9e4a4b8094dc715613b2df4b"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "d563459b6554f2b7874300b8f0c11a6b"
server_key_iv: "6d7e08c02eb4ef846fd41d27eaf54e15"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.441483 27723 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.001s	sys 0.001s
I20260504 14:07:27.443858 27738 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:27.445070 27723 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.001s
I20260504 14:07:27.445176 27723 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "c565044b9e4a4b8094dc715613b2df4b"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "d563459b6554f2b7874300b8f0c11a6b"
server_key_iv: "6d7e08c02eb4ef846fd41d27eaf54e15"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.445266 27723 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:27.463573 27723 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:27.466624 27723 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:27.466786 27723 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:27.475724 27723 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:43237
I20260504 14:07:27.475741 27800 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:43237 every 8 connection(s)
I20260504 14:07:27.476815 27723 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:27.479694 27801 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:27.480134 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27723
I20260504 14:07:27.480221 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:27.480469 26619 external_mini_cluster.cc:1468] Setting key ff496fb14f7ed89dad692a92daeb3041
I20260504 14:07:27.485570 27801 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Bootstrap starting.
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:27.487982 27801 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:27.488737 27801 log.cc:826] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:27.490705 27801 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: No bootstrap required, opened a new log
I20260504 14:07:27.492800 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.481746 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57510 (local address 127.25.254.254:43237)
0504 14:07:27.482238 (+   492us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:27.482249 (+    11us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:27.482282 (+    33us) server_negotiation.cc:408] Connection header received
0504 14:07:27.482854 (+   572us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:27.482871 (+    17us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:27.483140 (+   269us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:27.483471 (+   331us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:27.484375 (+   904us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.485158 (+   783us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.485875 (+   717us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.486243 (+   368us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.488437 (+  2194us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:27.488454 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:27.488466 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:27.488495 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:27.490359 (+  1864us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.490808 (+   449us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.490817 (+     9us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.490823 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.490922 (+    99us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.491178 (+   256us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.491182 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.491183 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.491501 (+   318us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:27.491700 (+   199us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:27.491999 (+   299us) server_negotiation.cc:300] Negotiation successful
0504 14:07:27.492226 (+   227us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":293,"thread_start_us":139,"threads_started":1}
I20260504 14:07:27.493789 27801 raft_consensus.cc:359] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } }
I20260504 14:07:27.494002 27801 raft_consensus.cc:385] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:27.494100 27801 raft_consensus.cc:740] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c565044b9e4a4b8094dc715613b2df4b, State: Initialized, Role: FOLLOWER
I20260504 14:07:27.494535 27801 consensus_queue.cc:260] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } }
I20260504 14:07:27.494695 27801 raft_consensus.cc:399] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:27.494771 27801 raft_consensus.cc:493] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:27.494882 27801 raft_consensus.cc:3060] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:27.495767 27801 raft_consensus.cc:515] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } }
I20260504 14:07:27.496104 27801 leader_election.cc:304] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c565044b9e4a4b8094dc715613b2df4b; no voters: 
I20260504 14:07:27.496399 27801 leader_election.cc:290] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:27.496567 27806 raft_consensus.cc:2804] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:27.496907 27806 raft_consensus.cc:697] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [term 1 LEADER]: Becoming Leader. State: Replica: c565044b9e4a4b8094dc715613b2df4b, State: Running, Role: LEADER
I20260504 14:07:27.497200 27806 consensus_queue.cc:237] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } }
I20260504 14:07:27.497560 27801 sys_catalog.cc:565] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:27.498976 27807 sys_catalog.cc:455] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "c565044b9e4a4b8094dc715613b2df4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } } }
I20260504 14:07:27.499099 27807 sys_catalog.cc:458] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:27.499480 27815 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:27.499428 27808 sys_catalog.cc:455] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [sys.catalog]: SysCatalogTable state changed. Reason: New leader c565044b9e4a4b8094dc715613b2df4b. Latest consensus state: current_term: 1 leader_uuid: "c565044b9e4a4b8094dc715613b2df4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c565044b9e4a4b8094dc715613b2df4b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 43237 } } }
I20260504 14:07:27.499611 27808 sys_catalog.cc:458] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:27.502656 27815 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:27.508292 27815 catalog_manager.cc:1357] Generated new cluster ID: 66f581e0905143c886eb18b9ee5558ba
I20260504 14:07:27.508399 27815 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:27.518407 27815 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:27.519285 27815 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:27.525445 27815 catalog_manager.cc:6044] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Generated new TSK 0
I20260504 14:07:27.526060 27815 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:27.589946 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43237
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38401
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:27.695946 27829 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:27.696216 27829 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:27.696316 27829 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:27.700520 27829 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:27.700618 27829 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:27.700713 27829 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:27.700798 27829 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:27.705282 27829 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38401
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:43237
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27829
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:27.706547 27829 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:27.707440 27829 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:27.714488 27835 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:27.714707 27837 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:27.714495 27834 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:27.714756 27829 server_base.cc:1061] running on GCE node
I20260504 14:07:27.715309 27829 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:27.715919 27829 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:27.717095 27829 hybrid_clock.cc:648] HybridClock initialized: now 1777903647717072 us; error 40 us; skew 500 ppm
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:27.719870 27829 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:27.720904 27829 webserver.cc:492] Webserver started at http://127.25.254.193:43037/ using document root <none> and password file <none>
I20260504 14:07:27.721434 27829 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:27.721480 27829 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:27.721637 27829 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:27.723436 27829 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "97c217ef7f0e437e90bc46ff50b42155"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "7a5cfc68f5329d69f139dff3a67343df"
server_key_iv: "0958af91407751919da68f5f91496f19"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.723874 27829 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "97c217ef7f0e437e90bc46ff50b42155"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "7a5cfc68f5329d69f139dff3a67343df"
server_key_iv: "0958af91407751919da68f5f91496f19"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.727421 27829 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:27.729553 27844 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:27.730787 27829 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:27.730893 27829 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "97c217ef7f0e437e90bc46ff50b42155"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "7a5cfc68f5329d69f139dff3a67343df"
server_key_iv: "0958af91407751919da68f5f91496f19"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.730979 27829 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:27.742007 27829 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:27.744849 27829 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:27.745020 27829 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:27.746716 27829 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:27.746804 27829 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:27.746858 27829 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:27.746891 27829 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:27.758827 27829 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:33467
I20260504 14:07:27.758834 27959 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:33467 every 8 connection(s)
I20260504 14:07:27.759852 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.747348 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:44461 (local address 127.25.254.254:43237)
0504 14:07:27.747600 (+   252us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:27.747604 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:27.748231 (+   627us) server_negotiation.cc:408] Connection header received
0504 14:07:27.749128 (+   897us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:27.749132 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:27.749243 (+   111us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:27.749331 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:27.751067 (+  1736us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.751968 (+   901us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.752865 (+   897us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.753055 (+   190us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.755465 (+  2410us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:27.755494 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:27.755497 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:27.755524 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:27.757522 (+  1998us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.758128 (+   606us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.758131 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.758133 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.758249 (+   116us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.758742 (+   493us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.758749 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.758754 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.758962 (+   208us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:27.759066 (+   104us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:27.759545 (+   479us) server_negotiation.cc:300] Negotiation successful
0504 14:07:27.759693 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:07:27.759974 27829 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:27.760598 27853 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.747651 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.193:44461)
0504 14:07:27.748093 (+   442us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:27.748125 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:27.748937 (+   812us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:27.749538 (+   601us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:27.749550 (+    12us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:27.750131 (+   581us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:27.750882 (+   751us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.750901 (+    19us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.752112 (+  1211us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.752115 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:27.752723 (+   608us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.752732 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.752963 (+   231us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.753657 (+   694us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:27.753681 (+    24us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:27.755289 (+  1608us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:27.757660 (+  2371us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:27.757666 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:27.757679 (+    13us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:27.757982 (+   303us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:27.758465 (+   483us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:27.758469 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:27.758471 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:27.758607 (+   136us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:27.759077 (+   470us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:27.759082 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:27.759293 (+   211us) client_negotiation.cc:770] Sending connection context
0504 14:07:27.759490 (+   197us) client_negotiation.cc:241] Negotiation successful
0504 14:07:27.759763 (+   273us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":277,"thread_start_us":115,"threads_started":1}
I20260504 14:07:27.762032 27960 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43237
I20260504 14:07:27.762288 27960 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:27.762876 27960 heartbeater.cc:507] Master 127.25.254.254:43237 requested a full tablet report, sending...
I20260504 14:07:27.764644 27753 ts_manager.cc:194] Registered new tserver with Master: 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467)
I20260504 14:07:27.766124 27753 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:44461
I20260504 14:07:27.766300 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27829
I20260504 14:07:27.766453 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:27.766793 26619 external_mini_cluster.cc:1468] Setting key 5076d642df18b743db13f5d98c5969f5
I20260504 14:07:27.771962 27966 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.764352 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.193:34845)
0504 14:07:27.764710 (+   358us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:27.764722 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:27.764886 (+   164us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:27.765135 (+   249us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:27.765137 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:27.765297 (+   160us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:27.765539 (+   242us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.765544 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.766576 (+  1032us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.766580 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:27.767095 (+   515us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.767103 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.767572 (+   469us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.768241 (+   669us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:27.768258 (+    17us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:27.768580 (+   322us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:27.770520 (+  1940us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:27.770527 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:27.770532 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:27.770795 (+   263us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:27.771077 (+   282us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:27.771080 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:27.771081 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:27.771129 (+    48us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:27.771528 (+   399us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:27.771530 (+     2us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:27.771624 (+    94us) client_negotiation.cc:770] Sending connection context
0504 14:07:27.771711 (+    87us) client_negotiation.cc:241] Negotiation successful
0504 14:07:27.771814 (+   103us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":294,"thread_start_us":93,"threads_started":1}
I20260504 14:07:27.772132 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.764468 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:34845 (local address 127.25.254.254:43237)
0504 14:07:27.764616 (+   148us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:27.764620 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:27.764815 (+   195us) server_negotiation.cc:408] Connection header received
0504 14:07:27.764979 (+   164us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:27.764983 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:27.765025 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:27.765113 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:27.765656 (+   543us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.766431 (+   775us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.767254 (+   823us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.767435 (+   181us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.768710 (+  1275us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:27.768730 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:27.768737 (+     7us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:27.768769 (+    32us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:27.770397 (+  1628us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.770910 (+   513us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.770916 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.770919 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.770972 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:27.771222 (+   250us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:27.771228 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:27.771232 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:27.771416 (+   184us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:27.771523 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:27.771796 (+   273us) server_negotiation.cc:300] Negotiation successful
0504 14:07:27.771920 (+   124us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:27.826982 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43237
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38401
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:27.931589 27971 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:27.931823 27971 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:27.931882 27971 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:27.935487 27971 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:27.935567 27971 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:27.935645 27971 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:27.935737 27971 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:27.940160 27971 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38401
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:43237
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.27971
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:27.941351 27971 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:27.942302 27971 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:27.948756 27979 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:27.948926 27971 server_base.cc:1061] running on GCE node
W20260504 14:07:27.948782 27976 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:27.948756 27977 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:27.949546 27971 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:27.950059 27971 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:27.951212 27971 hybrid_clock.cc:648] HybridClock initialized: now 1777903647951176 us; error 51 us; skew 500 ppm
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:27.953933 27971 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:27.955024 27971 webserver.cc:492] Webserver started at http://127.25.254.194:40323/ using document root <none> and password file <none>
I20260504 14:07:27.955654 27971 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:27.955716 27971 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:27.955919 27971 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:27.957623 27971 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "d320fa750b4a4bb38f3f56d15216be57"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "e99c45818a4599a9bd08711394450677"
server_key_iv: "49bf8d5806e9e75caa9421cfd7fe25cb"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.958107 27971 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "d320fa750b4a4bb38f3f56d15216be57"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "e99c45818a4599a9bd08711394450677"
server_key_iv: "49bf8d5806e9e75caa9421cfd7fe25cb"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.961737 27971 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:27.964507 27986 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:27.965569 27971 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:27.965669 27971 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "d320fa750b4a4bb38f3f56d15216be57"
format_stamp: "Formatted at 2026-05-04 14:07:27 on dist-test-slave-2x32"
server_key: "e99c45818a4599a9bd08711394450677"
server_key_iv: "49bf8d5806e9e75caa9421cfd7fe25cb"
server_key_version: "encryptionkey@0"
I20260504 14:07:27.965799 27971 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:27.983543 27971 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:27.986598 27971 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:27.986827 27971 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:27.988416 27971 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:27.988466 27971 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:27.988507 27971 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:27.988523 27971 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:27 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:28.001144 27971 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:46767
I20260504 14:07:28.001216 28101 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:46767 every 8 connection(s)
I20260504 14:07:28.002452 27971 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:28.002622 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.989083 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:60255 (local address 127.25.254.254:43237)
0504 14:07:27.989228 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:27.989231 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:27.990833 (+  1602us) server_negotiation.cc:408] Connection header received
0504 14:07:27.991956 (+  1123us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:27.991960 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:27.992029 (+    69us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:27.992154 (+   125us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:27.994304 (+  2150us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.995030 (+   726us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.995833 (+   803us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.996013 (+   180us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.998505 (+  2492us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:27.998534 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:27.998536 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:27.998564 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.000416 (+  1852us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.001004 (+   588us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.001010 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.001012 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.001074 (+    62us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.001479 (+   405us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.001483 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.001484 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.001674 (+   190us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.001773 (+    99us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.002270 (+   497us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.002429 (+   159us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":48}
I20260504 14:07:28.003304 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 27971
I20260504 14:07:28.003338 27995 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:27.989448 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.194:60255)
0504 14:07:27.990688 (+  1240us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:27.990721 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:27.991653 (+   932us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:27.992754 (+  1101us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:27.992769 (+    15us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:27.993381 (+   612us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:27.994109 (+   728us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.994129 (+    20us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.995168 (+  1039us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:27.995172 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:27.995704 (+   532us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:27.995712 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:27.995933 (+   221us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:27.996591 (+   658us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:27.996651 (+    60us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:27.998296 (+  1645us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:28.000544 (+  2248us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.000549 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.000559 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.000894 (+   335us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.001182 (+   288us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.001185 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.001188 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.001315 (+   127us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.001776 (+   461us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:28.001780 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:28.001989 (+   209us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.002196 (+   207us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.002496 (+   300us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":1043,"thread_start_us":807,"threads_started":1}
I20260504 14:07:28.003427 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:28.003688 26619 external_mini_cluster.cc:1468] Setting key c3b66faba06fb38397225b39be6f2c5d
I20260504 14:07:28.005291 28102 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43237
I20260504 14:07:28.005491 28102 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:28.006110 28102 heartbeater.cc:507] Master 127.25.254.254:43237 requested a full tablet report, sending...
I20260504 14:07:28.007272 27753 ts_manager.cc:194] Registered new tserver with Master: d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194:46767)
I20260504 14:07:28.008020 27753 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:60255
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:07:28.013414 28109 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.006962 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.194:42065)
0504 14:07:28.007265 (+   303us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.007280 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.007401 (+   121us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.007645 (+   244us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.007648 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.007833 (+   185us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:28.008043 (+   210us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.008049 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.008754 (+   705us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.008759 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.009214 (+   455us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.009222 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.009356 (+   134us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.009897 (+   541us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:28.009909 (+    12us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:28.010263 (+   354us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:28.012121 (+  1858us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.012125 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.012127 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.012301 (+   174us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.012564 (+   263us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.012567 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.012569 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.012614 (+    45us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.012964 (+   350us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:28.012968 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:28.013050 (+    82us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.013165 (+   115us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.013273 (+   108us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":228,"thread_start_us":90,"threads_started":1}
I20260504 14:07:28.013562 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.007005 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:42065 (local address 127.25.254.254:43237)
0504 14:07:28.007151 (+   146us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.007157 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.007359 (+   202us) server_negotiation.cc:408] Connection header received
0504 14:07:28.007488 (+   129us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.007491 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.007530 (+    39us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.007631 (+   101us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.008164 (+   533us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.008644 (+   480us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.009330 (+   686us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.009540 (+   210us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.010453 (+   913us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.010470 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.010473 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.010501 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.011988 (+  1487us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.012406 (+   418us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.012411 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.012413 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.012461 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.012708 (+   247us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.012711 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.012713 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.012871 (+   158us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.012975 (+   104us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.013256 (+   281us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.013383 (+   127us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":46}
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:28.058082 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:43237
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38401
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:28.163005 28113 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:28.163293 28113 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:28.163386 28113 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:28.166769 28113 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:28.166836 28113 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:28.166887 28113 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:28.166987 28113 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:28.171422 28113 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38401
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:43237
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.28113
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:28.172644 28113 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:28.173691 28113 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:28.180686 28118 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:28.180677 28121 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:28.180799 28113 server_base.cc:1061] running on GCE node
W20260504 14:07:28.180678 28119 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:28.181247 28113 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:28.181804 28113 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:28.183007 28113 hybrid_clock.cc:648] HybridClock initialized: now 1777903648182960 us; error 58 us; skew 500 ppm
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903648, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:28.186123 28113 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:28.187378 28113 webserver.cc:492] Webserver started at http://127.25.254.195:37601/ using document root <none> and password file <none>
I20260504 14:07:28.187966 28113 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:28.188016 28113 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:28.188236 28113 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:28.190011 28113 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "33e0e159305a42cb82eabfd31e594bd3"
format_stamp: "Formatted at 2026-05-04 14:07:28 on dist-test-slave-2x32"
server_key: "5bde7d68050a9cfa43e984f0ef60fab7"
server_key_iv: "3ca3caaf742868bf675e08312a5a938a"
server_key_version: "encryptionkey@0"
I20260504 14:07:28.190579 28113 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "33e0e159305a42cb82eabfd31e594bd3"
format_stamp: "Formatted at 2026-05-04 14:07:28 on dist-test-slave-2x32"
server_key: "5bde7d68050a9cfa43e984f0ef60fab7"
server_key_iv: "3ca3caaf742868bf675e08312a5a938a"
server_key_version: "encryptionkey@0"
I20260504 14:07:28.194063 28113 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:28.196396 28128 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:28.197645 28113 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:28.197810 28113 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "33e0e159305a42cb82eabfd31e594bd3"
format_stamp: "Formatted at 2026-05-04 14:07:28 on dist-test-slave-2x32"
server_key: "5bde7d68050a9cfa43e984f0ef60fab7"
server_key_iv: "3ca3caaf742868bf675e08312a5a938a"
server_key_version: "encryptionkey@0"
I20260504 14:07:28.197935 28113 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:28.232827 28113 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:28.236311 28113 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:28.236534 28113 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:28.238106 28113 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:28.238211 28113 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:28.238284 28113 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:28.238319 28113 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903648, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:28.251046 28113 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:38097
I20260504 14:07:28.251147 28243 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:38097 every 8 connection(s)
I20260504 14:07:28.252141 28113 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:28.252270 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.238977 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:54299 (local address 127.25.254.254:43237)
0504 14:07:28.239125 (+   148us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.239129 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.239983 (+   854us) server_negotiation.cc:408] Connection header received
0504 14:07:28.240932 (+   949us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.240936 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.240996 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.241107 (+   111us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.242761 (+  1654us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.243510 (+   749us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.244234 (+   724us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.244506 (+   272us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.247813 (+  3307us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.247842 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.247847 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.247886 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.249936 (+  2050us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.250624 (+   688us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.250630 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.250634 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.250697 (+    63us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.251085 (+   388us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.251090 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.251094 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.251293 (+   199us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.251396 (+   103us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.251969 (+   573us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.252089 (+   120us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":52}
I20260504 14:07:28.252884 28137 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.239311 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.195:54299)
0504 14:07:28.239822 (+   511us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.239857 (+    35us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.240718 (+   861us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.241268 (+   550us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.241276 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.241908 (+   632us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:28.242611 (+   703us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.242624 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.243653 (+  1029us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.243657 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.244094 (+   437us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.244101 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.244360 (+   259us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.245093 (+   733us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:28.245121 (+    28us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:28.247608 (+  2487us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:28.250085 (+  2477us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.250090 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.250103 (+    13us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.250460 (+   357us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.250824 (+   364us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.250830 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.250834 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.250960 (+   126us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.251484 (+   524us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:28.251489 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:28.251700 (+   211us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.251871 (+   171us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.252063 (+   192us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":332,"thread_start_us":141,"threads_started":1}
I20260504 14:07:28.254107 28244 heartbeater.cc:344] Connected to a master server at 127.25.254.254:43237
I20260504 14:07:28.254341 28244 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:28.254794 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 28113
I20260504 14:07:28.254914 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:28.255028 28244 heartbeater.cc:507] Master 127.25.254.254:43237 requested a full tablet report, sending...
I20260504 14:07:28.255188 26619 external_mini_cluster.cc:1468] Setting key 71f457422f20b6d069c3aedac54ad09d
I20260504 14:07:28.256155 27753 ts_manager.cc:194] Registered new tserver with Master: 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097)
I20260504 14:07:28.256870 27753 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:54299
I20260504 14:07:28.257589 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:07:28.263767 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.256619 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:54921 (local address 127.25.254.254:43237)
0504 14:07:28.256750 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.256754 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.256766 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:07:28.256798 (+    32us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.256801 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.256843 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.256952 (+   109us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.257754 (+   802us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.258491 (+   737us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.259441 (+   950us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.259673 (+   232us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.260414 (+   741us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.260436 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.260442 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.260475 (+    33us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.262248 (+  1773us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.262634 (+   386us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.262640 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.262645 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.262702 (+    57us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.262982 (+   280us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.262984 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.262986 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.263167 (+   181us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.263245 (+    78us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.263542 (+   297us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.263639 (+    97us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":53}
I20260504 14:07:28.263929 28250 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.256221 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.25.254.195:54921)
0504 14:07:28.256506 (+   285us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.256531 (+    25us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.256617 (+    86us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.257056 (+   439us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.257061 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.257235 (+   174us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:28.257540 (+   305us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.257575 (+    35us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.258629 (+  1054us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.258636 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.259241 (+   605us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.259252 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.259383 (+   131us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.259969 (+   586us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:28.259992 (+    23us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:28.260279 (+   287us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:28.262362 (+  2083us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.262365 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.262367 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.262530 (+   163us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.262818 (+   288us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.262825 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.262827 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.262888 (+    61us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.263268 (+   380us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:28.263274 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:28.263375 (+   101us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.263490 (+   115us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.263646 (+   156us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":197,"thread_start_us":101,"threads_started":1}
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903648, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-user@KRBTEST.COM: 
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903648, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:28.283974 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.276182 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:43976 (local address 127.25.254.254:43237)
0504 14:07:28.276355 (+   173us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.276359 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.276371 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:07:28.276412 (+    41us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.276414 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.276456 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.276547 (+    91us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.277445 (+   898us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.277943 (+   498us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.278665 (+   722us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.278883 (+   218us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.281012 (+  2129us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.281038 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.281040 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.281067 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.282441 (+  1374us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.282847 (+   406us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.282853 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.282855 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.282910 (+    55us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.283159 (+   249us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.283163 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.283165 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.283315 (+   150us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.283464 (+   149us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.283691 (+   227us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.283802 (+   111us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":65}
I20260504 14:07:28.286968 27755 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:28.289664 27755 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:28.303731 28261 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.298601 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33467 (local address 127.0.0.1:57820)
0504 14:07:28.299423 (+   822us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.299452 (+    29us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.299583 (+   131us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.300413 (+   830us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.300420 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.300443 (+    23us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.300738 (+   295us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.300750 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302215 (+  1465us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.302219 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.303200 (+   981us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.303209 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.303290 (+    81us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.303361 (+    71us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.303445 (+    84us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.303518 (+    73us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":745,"spinlock_wait_cycles":2432,"thread_start_us":123,"threads_started":1}
I20260504 14:07:28.303969 28264 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.298914 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:46767 (local address 127.0.0.1:60086)
0504 14:07:28.300224 (+  1310us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.300236 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.300361 (+   125us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.301006 (+   645us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.301009 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.301025 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.301203 (+   178us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.301207 (+     4us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302689 (+  1482us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.302695 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.303563 (+   868us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.303574 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.303689 (+   115us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.303703 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.303757 (+    54us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.303811 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":1238,"thread_start_us":93,"threads_started":1}
I20260504 14:07:28.304138 28263 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.298433 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:38097 (local address 127.0.0.1:47418)
0504 14:07:28.299404 (+   971us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.299444 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.299581 (+   137us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.300336 (+   755us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.300343 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.300382 (+    39us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.300659 (+   277us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.300666 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302215 (+  1549us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.302219 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.303308 (+  1089us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.303315 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.303411 (+    96us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.303428 (+    17us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.303477 (+    49us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.303518 (+    41us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":793,"thread_start_us":75,"threads_started":1}
I20260504 14:07:28.304625 28265 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.299472 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57820 (local address 127.25.254.193:33467)
0504 14:07:28.299935 (+   463us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.299942 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.299959 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:28.300034 (+    75us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.300039 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.300252 (+   213us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.300422 (+   170us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.300880 (+   458us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302054 (+  1174us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.303704 (+  1650us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.304248 (+   544us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.304381 (+   133us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.304461 (+    80us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.304531 (+    70us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":366,"thread_start_us":82,"threads_started":1}
I20260504 14:07:28.305001 28262 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.298545 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:47418 (local address 127.25.254.195:38097)
0504 14:07:28.299039 (+   494us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.299046 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.299716 (+   670us) server_negotiation.cc:408] Connection header received
0504 14:07:28.299804 (+    88us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.299812 (+     8us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.300000 (+   188us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.300141 (+   141us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.300780 (+   639us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302054 (+  1274us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.303452 (+  1398us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.304618 (+  1166us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.304711 (+    93us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.304785 (+    74us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.304872 (+    87us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":339,"thread_start_us":111,"threads_started":1}
I20260504 14:07:28.306059 28266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.299811 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:60086 (local address 127.25.254.194:46767)
0504 14:07:28.300142 (+   331us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.300151 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.300321 (+   170us) server_negotiation.cc:408] Connection header received
0504 14:07:28.300524 (+   203us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.300532 (+     8us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.300693 (+   161us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.300858 (+   165us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.301378 (+   520us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.302551 (+  1173us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.304240 (+  1689us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.305613 (+  1373us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.305746 (+   133us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.305813 (+    67us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.305899 (+    86us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":227,"thread_start_us":139,"threads_started":1}
I20260504 14:07:28.307762 28178 tablet_service.cc:1511] Processing CreateTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b (DEFAULT_TABLE table=test-table [id=adf8f18488374cbdad6dc720f4f36574]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.307762 27894 tablet_service.cc:1511] Processing CreateTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b (DEFAULT_TABLE table=test-table [id=adf8f18488374cbdad6dc720f4f36574]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.308350 28036 tablet_service.cc:1511] Processing CreateTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b (DEFAULT_TABLE table=test-table [id=adf8f18488374cbdad6dc720f4f36574]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.308768 27894 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5ab71a6c0a3c44059d5fe1493e51408b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.308768 28178 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5ab71a6c0a3c44059d5fe1493e51408b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.309231 28036 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5ab71a6c0a3c44059d5fe1493e51408b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.315583 28268 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Bootstrap starting.
I20260504 14:07:28.315757 28269 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Bootstrap starting.
I20260504 14:07:28.317430 28268 tablet_bootstrap.cc:654] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.317445 28269 tablet_bootstrap.cc:654] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.318113 28268 log.cc:826] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:28.318524 28267 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Bootstrap starting.
I20260504 14:07:28.318673 28269 log.cc:826] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:28.320113 28268 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: No bootstrap required, opened a new log
I20260504 14:07:28.320364 28268 ts_tablet_manager.cc:1403] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:28.320508 28269 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: No bootstrap required, opened a new log
I20260504 14:07:28.320588 28267 tablet_bootstrap.cc:654] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.320711 28269 ts_tablet_manager.cc:1403] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:28.321727 28267 log.cc:826] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:28.323534 28269 raft_consensus.cc:359] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.323662 28267 tablet_bootstrap.cc:492] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: No bootstrap required, opened a new log
I20260504 14:07:28.323746 28269 raft_consensus.cc:385] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.323644 28268 raft_consensus.cc:359] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.323812 28269 raft_consensus.cc:740] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 33e0e159305a42cb82eabfd31e594bd3, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.323832 28268 raft_consensus.cc:385] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.323858 28267 ts_tablet_manager.cc:1403] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:28.323891 28268 raft_consensus.cc:740] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97c217ef7f0e437e90bc46ff50b42155, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.324254 28269 consensus_queue.cc:260] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.324306 28268 consensus_queue.cc:260] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.324944 27960 heartbeater.cc:499] Master 127.25.254.254:43237 was elected leader, sending a full tablet report...
I20260504 14:07:28.325213 28269 ts_tablet_manager.cc:1434] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Time spent starting tablet: real 0.004s	user 0.005s	sys 0.000s
I20260504 14:07:28.325645 28244 heartbeater.cc:499] Master 127.25.254.254:43237 was elected leader, sending a full tablet report...
I20260504 14:07:28.325999 28268 ts_tablet_manager.cc:1434] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Time spent starting tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:07:28.326836 28267 raft_consensus.cc:359] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.327026 28267 raft_consensus.cc:385] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.327081 28267 raft_consensus.cc:740] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d320fa750b4a4bb38f3f56d15216be57, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.327534 28267 consensus_queue.cc:260] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.328238 28102 heartbeater.cc:499] Master 127.25.254.254:43237 was elected leader, sending a full tablet report...
I20260504 14:07:28.328414 28267 ts_tablet_manager.cc:1434] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Time spent starting tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:28.453877 28275 raft_consensus.cc:493] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:28.454115 28275 raft_consensus.cc:515] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.455399 28275 leader_election.cc:290] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097), 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467)
I20260504 14:07:28.458202 27995 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.455609 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:38097 (local address 127.25.254.194:35673)
0504 14:07:28.455738 (+   129us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.455757 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.455859 (+   102us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.456149 (+   290us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.456152 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.456172 (+    20us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.456391 (+   219us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.456397 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.457161 (+   764us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.457168 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.457755 (+   587us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.457762 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.457905 (+   143us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.457924 (+    19us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.457976 (+    52us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.458037 (+    61us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":38}
I20260504 14:07:28.458648 28262 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.455723 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:35673 (local address 127.25.254.195:38097)
0504 14:07:28.455872 (+   149us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.455878 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.455900 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:07:28.455955 (+    55us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.455958 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.456021 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.456122 (+   101us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.456521 (+   399us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.457042 (+   521us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.457885 (+   843us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.458415 (+   530us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.458452 (+    37us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.458506 (+    54us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.458558 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
I20260504 14:07:28.459249 28197 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5ab71a6c0a3c44059d5fe1493e51408b" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "33e0e159305a42cb82eabfd31e594bd3" is_pre_election: true
I20260504 14:07:28.459424 28276 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.455723 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33467 (local address 127.25.254.194:40295)
0504 14:07:28.456102 (+   379us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.456119 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.456228 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.456502 (+   274us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.456505 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.456524 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.456779 (+   255us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.456784 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.458112 (+  1328us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.458118 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.459004 (+   886us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.459012 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.459145 (+   133us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.459159 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.459210 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.459262 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":310,"thread_start_us":139,"threads_started":1}
I20260504 14:07:28.459576 28197 raft_consensus.cc:2468] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 0.
I20260504 14:07:28.459929 28265 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.455915 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:40295 (local address 127.25.254.193:33467)
0504 14:07:28.456046 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.456050 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.456198 (+   148us) server_negotiation.cc:408] Connection header received
0504 14:07:28.456346 (+   148us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.456351 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.456398 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.456489 (+    91us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.456910 (+   421us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.457797 (+   887us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.459133 (+  1336us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.459663 (+   530us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.459704 (+    41us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.459759 (+    55us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.459807 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":52}
I20260504 14:07:28.460129 27987 leader_election.cc:304] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 33e0e159305a42cb82eabfd31e594bd3, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.460465 28275 raft_consensus.cc:2804] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:28.460491 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5ab71a6c0a3c44059d5fe1493e51408b" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155" is_pre_election: true
I20260504 14:07:28.460566 28275 raft_consensus.cc:493] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:28.460628 28275 raft_consensus.cc:3060] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.460791 27914 raft_consensus.cc:2468] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 0.
I20260504 14:07:28.462045 28275 raft_consensus.cc:515] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.462543 28275 leader_election.cc:290] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 election: Requested vote from peers 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097), 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467)
I20260504 14:07:28.462934 28197 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5ab71a6c0a3c44059d5fe1493e51408b" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "33e0e159305a42cb82eabfd31e594bd3"
I20260504 14:07:28.463048 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5ab71a6c0a3c44059d5fe1493e51408b" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155"
I20260504 14:07:28.463133 28197 raft_consensus.cc:3060] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.463212 27914 raft_consensus.cc:3060] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.464221 27914 raft_consensus.cc:2468] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 1.
I20260504 14:07:28.464464 28197 raft_consensus.cc:2468] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 1.
I20260504 14:07:28.464607 27988 leader_election.cc:304] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97c217ef7f0e437e90bc46ff50b42155, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.464823 28275 raft_consensus.cc:2804] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:28.465057 28275 raft_consensus.cc:697] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 1 LEADER]: Becoming Leader. State: Replica: d320fa750b4a4bb38f3f56d15216be57, State: Running, Role: LEADER
I20260504 14:07:28.465374 28275 consensus_queue.cc:237] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } }
I20260504 14:07:28.468588 27755 catalog_manager.cc:5671] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 reported cstate change: term changed from 0 to 1, leader changed from <none> to d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194). New cstate: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } health_report { overall_health: HEALTHY } } }
I20260504 14:07:28.497079 28266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.494144 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:60092 (local address 127.25.254.194:46767)
0504 14:07:28.494297 (+   153us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.494301 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.494315 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:28.494483 (+   168us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.494486 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.494540 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.494623 (+    83us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:28.494999 (+   376us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.495517 (+   518us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.496158 (+   641us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.496289 (+   131us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.496331 (+    42us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.496726 (+   395us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.496807 (+    81us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.496930 (+   123us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.496970 (+    40us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":31}
I20260504 14:07:28.502925 28197 raft_consensus.cc:1275] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Refusing update from remote peer d320fa750b4a4bb38f3f56d15216be57: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:28.502975 27914 raft_consensus.cc:1275] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Refusing update from remote peer d320fa750b4a4bb38f3f56d15216be57: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
W20260504 14:07:28.503664 28103 tablet.cc:2404] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:28.503687 28275 consensus_queue.cc:1048] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Connected to new peer: Peer: permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260504 14:07:28.503835 28245 tablet.cc:2404] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:28.503929 28277 consensus_queue.cc:1048] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:28.511662 28281 mvcc.cc:204] Tried to move back new op lower bound from 7282293344260550656 to 7282293344116826112. Current Snapshot: MvccSnapshot[applied={T|T < 7282293344260550656}]
I20260504 14:07:28.513427 28282 mvcc.cc:204] Tried to move back new op lower bound from 7282293344260550656 to 7282293344116826112. Current Snapshot: MvccSnapshot[applied={T|T < 7282293344260550656 or (T in {7282293344260550656})}]
I20260504 14:07:28.520526 27755 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:28.520674 27755 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:28.523203 27755 catalog_manager.cc:5958] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Sending DeleteTablet for 3 replicas of tablet 5ab71a6c0a3c44059d5fe1493e51408b
I20260504 14:07:28.524017 28178 tablet_service.cc:1558] Processing DeleteTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:47418
I20260504 14:07:28.524124 28036 tablet_service.cc:1558] Processing DeleteTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:60086
I20260504 14:07:28.524156 27894 tablet_service.cc:1558] Processing DeleteTablet for tablet 5ab71a6c0a3c44059d5fe1493e51408b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57820
I20260504 14:07:28.524572 28292 tablet_replica.cc:333] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: stopping tablet replica
I20260504 14:07:28.524564 28291 tablet_replica.cc:333] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: stopping tablet replica
I20260504 14:07:28.524824 28291 raft_consensus.cc:2243] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:28.524824 28292 raft_consensus.cc:2243] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:28.525117 28292 raft_consensus.cc:2272] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:28.525146 28291 raft_consensus.cc:2272] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:28.525468 28290 tablet_replica.cc:333] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: stopping tablet replica
I20260504 14:07:28.525702 28290 raft_consensus.cc:2243] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:28.525688 27755 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260504 14:07:28.525980 28290 raft_consensus.cc:2272] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Raft consensus is shut down!
W20260504 14:07:28.526119 27755 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:28.527334 28291 ts_tablet_manager.cc:1916] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.527334 28290 ts_tablet_manager.cc:1916] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.529747 28290 ts_tablet_manager.cc:1929] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:28.529747 28291 ts_tablet_manager.cc:1929] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:28.529836 28291 log.cc:1199] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/5ab71a6c0a3c44059d5fe1493e51408b
I20260504 14:07:28.530028 28290 log.cc:1199] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/5ab71a6c0a3c44059d5fe1493e51408b
I20260504 14:07:28.530144 28291 ts_tablet_manager.cc:1950] T 5ab71a6c0a3c44059d5fe1493e51408b P d320fa750b4a4bb38f3f56d15216be57: Deleting consensus metadata
I20260504 14:07:28.530465 28290 ts_tablet_manager.cc:1950] T 5ab71a6c0a3c44059d5fe1493e51408b P 33e0e159305a42cb82eabfd31e594bd3: Deleting consensus metadata
I20260504 14:07:28.531500 27742 catalog_manager.cc:5002] TS d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194:46767): tablet 5ab71a6c0a3c44059d5fe1493e51408b (table test-table [id=adf8f18488374cbdad6dc720f4f36574]) successfully deleted
I20260504 14:07:28.531888 28292 ts_tablet_manager.cc:1916] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.531996 27894 tablet_service.cc:1511] Processing CreateTablet for tablet 26317babbb8e448bacaff108d452cb73 (DEFAULT_TABLE table=test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.532039 28036 tablet_service.cc:1511] Processing CreateTablet for tablet 26317babbb8e448bacaff108d452cb73 (DEFAULT_TABLE table=test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.532274 27894 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 26317babbb8e448bacaff108d452cb73. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.532282 28036 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 26317babbb8e448bacaff108d452cb73. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.534373 28292 ts_tablet_manager.cc:1929] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:28.534457 28292 log.cc:1199] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/5ab71a6c0a3c44059d5fe1493e51408b
I20260504 14:07:28.534601 28267 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Bootstrap starting.
I20260504 14:07:28.534767 28292 ts_tablet_manager.cc:1950] T 5ab71a6c0a3c44059d5fe1493e51408b P 97c217ef7f0e437e90bc46ff50b42155: Deleting consensus metadata
I20260504 14:07:28.535315 28178 tablet_service.cc:1511] Processing CreateTablet for tablet 26317babbb8e448bacaff108d452cb73 (DEFAULT_TABLE table=test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:28.535602 28178 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 26317babbb8e448bacaff108d452cb73. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.535635 28267 tablet_bootstrap.cc:654] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.535984 27740 catalog_manager.cc:5002] TS 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467): tablet 5ab71a6c0a3c44059d5fe1493e51408b (table test-table [id=adf8f18488374cbdad6dc720f4f36574]) successfully deleted
I20260504 14:07:28.536298 28268 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Bootstrap starting.
I20260504 14:07:28.537043 28268 tablet_bootstrap.cc:654] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.537993 28269 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Bootstrap starting.
I20260504 14:07:28.538273 28268 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: No bootstrap required, opened a new log
I20260504 14:07:28.538357 28268 ts_tablet_manager.cc:1403] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:28.538918 28268 raft_consensus.cc:359] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.539026 28268 raft_consensus.cc:385] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.539057 28268 raft_consensus.cc:740] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97c217ef7f0e437e90bc46ff50b42155, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.539184 28268 consensus_queue.cc:260] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.539276 28269 tablet_bootstrap.cc:654] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.539497 28268 ts_tablet_manager.cc:1434] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:28.540642 28269 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: No bootstrap required, opened a new log
I20260504 14:07:28.540611 27739 catalog_manager.cc:5002] TS 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097): tablet 5ab71a6c0a3c44059d5fe1493e51408b (table test-table [id=adf8f18488374cbdad6dc720f4f36574]) successfully deleted
I20260504 14:07:28.540722 28269 ts_tablet_manager.cc:1403] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:28.541254 28269 raft_consensus.cc:359] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.541370 28269 raft_consensus.cc:385] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.541407 28269 raft_consensus.cc:740] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 33e0e159305a42cb82eabfd31e594bd3, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.541532 28269 consensus_queue.cc:260] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.541690 28267 tablet_bootstrap.cc:492] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: No bootstrap required, opened a new log
I20260504 14:07:28.541771 28267 ts_tablet_manager.cc:1403] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Time spent bootstrapping tablet: real 0.007s	user 0.002s	sys 0.000s
I20260504 14:07:28.542186 28269 ts_tablet_manager.cc:1434] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:07:28.542248 28267 raft_consensus.cc:359] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.542358 28267 raft_consensus.cc:385] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.542384 28267 raft_consensus.cc:740] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d320fa750b4a4bb38f3f56d15216be57, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.542523 28267 consensus_queue.cc:260] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.542809 28267 ts_tablet_manager.cc:1434] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:28.562202 28274 raft_consensus.cc:493] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:28.562357 28274 raft_consensus.cc:515] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.563674 28274 leader_election.cc:290] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467), d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194:46767)
I20260504 14:07:28.566879 28137 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.563896 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:46767 (local address 127.25.254.195:44739)
0504 14:07:28.564115 (+   219us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.564132 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.564224 (+    92us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.564571 (+   347us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.564575 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.564652 (+    77us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.564905 (+   253us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.564912 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.565741 (+   829us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.565747 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.566495 (+   748us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.566504 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.566611 (+   107us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.566630 (+    19us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.566681 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.566741 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":120,"spinlock_wait_cycles":5504}
I20260504 14:07:28.567701 28266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.564178 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:44739 (local address 127.25.254.194:46767)
0504 14:07:28.564328 (+   150us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.564333 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.564345 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:07:28.564397 (+    52us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.564400 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.564450 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.564538 (+    88us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.565029 (+   491us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.565620 (+   591us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.567154 (+  1534us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.567495 (+   341us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.567527 (+    32us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.567569 (+    42us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.567606 (+    37us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":62}
I20260504 14:07:28.568276 28056 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "26317babbb8e448bacaff108d452cb73" candidate_uuid: "33e0e159305a42cb82eabfd31e594bd3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d320fa750b4a4bb38f3f56d15216be57" is_pre_election: true
I20260504 14:07:28.568521 28056 raft_consensus.cc:2468] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 33e0e159305a42cb82eabfd31e594bd3 in term 0.
I20260504 14:07:28.568578 28296 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.563896 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33467 (local address 127.25.254.195:58887)
0504 14:07:28.564461 (+   565us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.564475 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.564621 (+   146us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.564909 (+   288us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.564912 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.564929 (+    17us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:28.565202 (+   273us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.565208 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.566265 (+  1057us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.566286 (+    21us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.568145 (+  1859us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.568154 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.568289 (+   135us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.568308 (+    19us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.568362 (+    54us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.568430 (+    68us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":461,"mutex_wait_us":247,"spinlock_wait_cycles":4352,"thread_start_us":118,"threads_started":1}
I20260504 14:07:28.569022 28265 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.563967 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:58887 (local address 127.25.254.193:33467)
0504 14:07:28.564118 (+   151us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.564122 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.564537 (+   415us) server_negotiation.cc:408] Connection header received
0504 14:07:28.564761 (+   224us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.564764 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.564810 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.564887 (+    77us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:28.565326 (+   439us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.566077 (+   751us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.568279 (+  2202us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.568771 (+   492us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.568805 (+    34us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.568857 (+    52us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.568909 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":69}
I20260504 14:07:28.569002 28132 leader_election.cc:304] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 33e0e159305a42cb82eabfd31e594bd3, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.569295 28274 raft_consensus.cc:2804] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:28.569378 28274 raft_consensus.cc:493] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:28.569420 28274 raft_consensus.cc:3060] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.569478 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "26317babbb8e448bacaff108d452cb73" candidate_uuid: "33e0e159305a42cb82eabfd31e594bd3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155" is_pre_election: true
I20260504 14:07:28.569629 27914 raft_consensus.cc:2468] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 33e0e159305a42cb82eabfd31e594bd3 in term 0.
I20260504 14:07:28.570659 28274 raft_consensus.cc:515] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.571074 28274 leader_election.cc:290] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [CANDIDATE]: Term 1 election: Requested vote from peers 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467), d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194:46767)
I20260504 14:07:28.571429 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "26317babbb8e448bacaff108d452cb73" candidate_uuid: "33e0e159305a42cb82eabfd31e594bd3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155"
I20260504 14:07:28.571477 28056 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "26317babbb8e448bacaff108d452cb73" candidate_uuid: "33e0e159305a42cb82eabfd31e594bd3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d320fa750b4a4bb38f3f56d15216be57"
I20260504 14:07:28.571552 27914 raft_consensus.cc:3060] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.571578 28056 raft_consensus.cc:3060] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.572646 28056 raft_consensus.cc:2468] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 33e0e159305a42cb82eabfd31e594bd3 in term 1.
I20260504 14:07:28.572654 27914 raft_consensus.cc:2468] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 33e0e159305a42cb82eabfd31e594bd3 in term 1.
I20260504 14:07:28.572947 28132 leader_election.cc:304] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 33e0e159305a42cb82eabfd31e594bd3, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.573196 28274 raft_consensus.cc:2804] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:28.573467 28274 raft_consensus.cc:697] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 LEADER]: Becoming Leader. State: Replica: 33e0e159305a42cb82eabfd31e594bd3, State: Running, Role: LEADER
I20260504 14:07:28.573722 28274 consensus_queue.cc:237] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.576606 27755 catalog_manager.cc:5671] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 reported cstate change: term changed from 0 to 1, leader changed from <none> to 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "33e0e159305a42cb82eabfd31e594bd3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } health_report { overall_health: HEALTHY } } }
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903647, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:28.621136 28304 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.612215 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:43237 (local address 127.0.0.1:43980)
0504 14:07:28.612538 (+   323us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.612554 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.612691 (+   137us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.612967 (+   276us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.612970 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.613556 (+   586us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:28.613823 (+   267us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.613831 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.614766 (+   935us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.614770 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.615396 (+   626us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.615406 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.615523 (+   117us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.616092 (+   569us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:28.616109 (+    17us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:28.617861 (+  1752us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:28.619633 (+  1772us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.619643 (+    10us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.619648 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.619868 (+   220us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.620114 (+   246us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:28.620119 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:28.620121 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:28.620176 (+    55us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:28.620532 (+   356us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:28.620540 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:28.620647 (+   107us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.620767 (+   120us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.620939 (+   172us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":227,"thread_start_us":123,"threads_started":1}
I20260504 14:07:28.621135 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.612294 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:43980 (local address 127.25.254.254:43237)
0504 14:07:28.612458 (+   164us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.612462 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.612658 (+   196us) server_negotiation.cc:408] Connection header received
0504 14:07:28.612788 (+   130us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.612791 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.612834 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.612904 (+    70us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.613992 (+  1088us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.614646 (+   654us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.615518 (+   872us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.615647 (+   129us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.618008 (+  2361us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.618035 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.618038 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.618059 (+    21us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.619511 (+  1452us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.619971 (+   460us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.619976 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.619978 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.620024 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.620264 (+   240us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.620268 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.620270 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.620422 (+   152us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.620529 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.620811 (+   282us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.621003 (+   192us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"mutex_wait_us":90,"server-negotiator.queue_time_us":74}
I20260504 14:07:28.625671 27755 catalog_manager.cc:2257] Servicing CreateTable request from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:43980:
name: "kudu_system.kudu_transactions"
schema {
  columns {
    name: "txn_id"
    type: INT64
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "entry_type"
    type: INT8
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "identifier"
    type: STRING
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "metadata"
    type: STRING
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\006\001\000\000\000\000\000\000\000\000\007\001@B\017\000\000\000\000\000"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "txn_id"
    }
  }
}
table_type: TXN_STATUS_TABLE
W20260504 14:07:28.626569 27755 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table kudu_system.kudu_transactions in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:28.632754 28036 tablet_service.cc:1511] Processing CreateTablet for tablet 5396ae24ef3f432e8e7551b30dd1b722 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=68afd245633f4f83882fc7ba3c35977b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:28.632694 27894 tablet_service.cc:1511] Processing CreateTablet for tablet 5396ae24ef3f432e8e7551b30dd1b722 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=68afd245633f4f83882fc7ba3c35977b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:28.633155 28036 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5396ae24ef3f432e8e7551b30dd1b722. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.633155 27894 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5396ae24ef3f432e8e7551b30dd1b722. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.635650 28267 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: Bootstrap starting.
I20260504 14:07:28.636264 28178 tablet_service.cc:1511] Processing CreateTablet for tablet 5396ae24ef3f432e8e7551b30dd1b722 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=68afd245633f4f83882fc7ba3c35977b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:28.636582 28267 tablet_bootstrap.cc:654] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.636626 28178 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5396ae24ef3f432e8e7551b30dd1b722. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:28.637806 28267 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: No bootstrap required, opened a new log
I20260504 14:07:28.637924 28267 ts_tablet_manager.cc:1403] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:28.638526 28267 raft_consensus.cc:359] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.638657 28267 raft_consensus.cc:385] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.638697 28267 raft_consensus.cc:740] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d320fa750b4a4bb38f3f56d15216be57, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.638829 28267 consensus_queue.cc:260] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.639554 28277 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.639704 28277 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.639978 28267 ts_tablet_manager.cc:1434] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:28.640008 28268 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: Bootstrap starting.
I20260504 14:07:28.641396 28268 tablet_bootstrap.cc:654] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.642757 28268 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: No bootstrap required, opened a new log
I20260504 14:07:28.642871 28268 ts_tablet_manager.cc:1403] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:28.643397 28268 raft_consensus.cc:359] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.643518 28268 raft_consensus.cc:385] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.643558 28268 raft_consensus.cc:740] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97c217ef7f0e437e90bc46ff50b42155, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.643666 28269 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: Bootstrap starting.
I20260504 14:07:28.643729 28268 consensus_queue.cc:260] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.644023 28268 ts_tablet_manager.cc:1434] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:28.644721 28269 tablet_bootstrap.cc:654] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:28.645414 28273 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.645560 28273 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.646291 28269 tablet_bootstrap.cc:492] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: No bootstrap required, opened a new log
I20260504 14:07:28.646386 28269 ts_tablet_manager.cc:1403] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:28.646862 28269 raft_consensus.cc:359] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.646972 28269 raft_consensus.cc:385] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:28.647003 28269 raft_consensus.cc:740] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 33e0e159305a42cb82eabfd31e594bd3, State: Initialized, Role: FOLLOWER
I20260504 14:07:28.647145 28269 consensus_queue.cc:260] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.648242 28274 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.648365 28274 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.648658 28269 ts_tablet_manager.cc:1434] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:28.655244 28277 raft_consensus.cc:493] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:28.655423 28277 raft_consensus.cc:515] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.655889 28277 leader_election.cc:290] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467), 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097)
I20260504 14:07:28.656363 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5396ae24ef3f432e8e7551b30dd1b722" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155" is_pre_election: true
I20260504 14:07:28.656505 27914 raft_consensus.cc:2468] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 0.
I20260504 14:07:28.656796 28197 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5396ae24ef3f432e8e7551b30dd1b722" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "33e0e159305a42cb82eabfd31e594bd3" is_pre_election: true
I20260504 14:07:28.656836 27988 leader_election.cc:304] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97c217ef7f0e437e90bc46ff50b42155, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.656957 28197 raft_consensus.cc:2468] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 0.
I20260504 14:07:28.657158 28277 raft_consensus.cc:2804] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:28.657220 28277 raft_consensus.cc:493] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:28.657281 28277 raft_consensus.cc:3060] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.658460 28277 raft_consensus.cc:515] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.658847 28277 leader_election.cc:290] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 election: Requested vote from peers 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467), 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097)
I20260504 14:07:28.659301 28197 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5396ae24ef3f432e8e7551b30dd1b722" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "33e0e159305a42cb82eabfd31e594bd3"
I20260504 14:07:28.659282 27914 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5396ae24ef3f432e8e7551b30dd1b722" candidate_uuid: "d320fa750b4a4bb38f3f56d15216be57" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97c217ef7f0e437e90bc46ff50b42155"
I20260504 14:07:28.659413 27914 raft_consensus.cc:3060] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.659413 28197 raft_consensus.cc:3060] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:28.660957 28197 raft_consensus.cc:2468] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 1.
I20260504 14:07:28.660970 27914 raft_consensus.cc:2468] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d320fa750b4a4bb38f3f56d15216be57 in term 1.
I20260504 14:07:28.661442 27987 leader_election.cc:304] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 33e0e159305a42cb82eabfd31e594bd3, d320fa750b4a4bb38f3f56d15216be57; no voters: 
I20260504 14:07:28.661613 28277 raft_consensus.cc:2804] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:28.661743 28277 raft_consensus.cc:697] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [term 1 LEADER]: Becoming Leader. State: Replica: d320fa750b4a4bb38f3f56d15216be57, State: Running, Role: LEADER
I20260504 14:07:28.661878 28277 consensus_queue.cc:237] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } }
I20260504 14:07:28.662669 28286 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: TxnStatusTablet state changed. Reason: New leader d320fa750b4a4bb38f3f56d15216be57. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.662772 28286 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:28.663873 28309 txn_status_manager.cc:874] Waiting until node catch up with all replicated operations in previous term...
I20260504 14:07:28.664029 28309 txn_status_manager.cc:930] Loading transaction status metadata into memory...
I20260504 14:07:28.664093 27755 catalog_manager.cc:5671] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 reported cstate change: term changed from 0 to 1, leader changed from <none> to d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194). New cstate: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } health_report { overall_health: UNKNOWN } } }
I20260504 14:07:28.677111 28304 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.673895 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:46767 (local address 127.0.0.1:60108)
0504 14:07:28.674059 (+   164us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.674073 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.674227 (+   154us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.674635 (+   408us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.674641 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.674653 (+    12us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:28.674858 (+   205us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.674864 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.675846 (+   982us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.675849 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.676270 (+   421us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.676277 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.676380 (+   103us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.676416 (+    36us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.676871 (+   455us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.676881 (+    10us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.676937 (+    56us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.676995 (+    58us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":44}
I20260504 14:07:28.677163 28266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.674003 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:60108 (local address 127.25.254.194:46767)
0504 14:07:28.674140 (+   137us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.674145 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.674187 (+    42us) server_negotiation.cc:408] Connection header received
0504 14:07:28.674352 (+   165us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.674356 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.674416 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.674530 (+   114us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:28.674971 (+   441us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.675683 (+   712us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.676412 (+   729us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.676608 (+   196us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.676661 (+    53us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.676770 (+   109us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.676855 (+    85us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.676983 (+   128us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.677031 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
I20260504 14:07:28.679957 27914 raft_consensus.cc:1275] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Refusing update from remote peer d320fa750b4a4bb38f3f56d15216be57: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:28.680019 28197 raft_consensus.cc:1275] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Refusing update from remote peer d320fa750b4a4bb38f3f56d15216be57: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:28.680428 28277 consensus_queue.cc:1048] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:28.680617 28286 consensus_queue.cc:1048] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57 [LEADER]: Connected to new peer: Peer: permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:28.682075 28273 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: TxnStatusTablet state changed. Reason: New leader d320fa750b4a4bb38f3f56d15216be57. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.682216 28273 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.683897 28286 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.684011 28286 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:28.684859 28280 mvcc.cc:204] Tried to move back new op lower bound from 7282293344988614656 to 7282293344920743936. Current Snapshot: MvccSnapshot[applied={T|T < 7282293344988614656 or (T in {7282293344988614656})}]
I20260504 14:07:28.687033 28274 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: TxnStatusTablet state changed. Reason: New leader d320fa750b4a4bb38f3f56d15216be57. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.687161 28274 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.687753 28273 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.687871 28273 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 97c217ef7f0e437e90bc46ff50b42155: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.688377 28275 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.688483 28275 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P d320fa750b4a4bb38f3f56d15216be57: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:28.689334 28274 tablet_replica.cc:442] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "d320fa750b4a4bb38f3f56d15216be57" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 } } peers { permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 } } peers { permanent_uuid: "33e0e159305a42cb82eabfd31e594bd3" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 38097 } } }
I20260504 14:07:28.689445 28274 tablet_replica.cc:445] T 5396ae24ef3f432e8e7551b30dd1b722 P 33e0e159305a42cb82eabfd31e594bd3: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:28.694628 28262 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.690644 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:47428 (local address 127.25.254.195:38097)
0504 14:07:28.690807 (+   163us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.690812 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.690826 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:28.690907 (+    81us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.690910 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.690976 (+    66us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.691078 (+   102us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:28.691624 (+   546us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.692360 (+   736us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.693110 (+   750us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.693343 (+   233us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.693500 (+   157us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.694037 (+   537us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.694228 (+   191us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.694422 (+   194us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.694476 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":76}
I20260504 14:07:28.705896 28266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.702558 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:58219 (local address 127.25.254.194:46767)
0504 14:07:28.702733 (+   175us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.702739 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.702756 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:28.702835 (+    79us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.702839 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.702919 (+    80us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.702996 (+    77us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:28.703535 (+   539us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.704303 (+   768us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.705089 (+   786us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.705286 (+   197us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.705333 (+    47us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.705419 (+    86us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.705504 (+    85us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.705721 (+   217us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.705764 (+    43us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:07:28.705910 28250 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.702420 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:46767 (local address 127.25.254.195:58219)
0504 14:07:28.702603 (+   183us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.702620 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.702713 (+    93us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.703050 (+   337us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.703054 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.703066 (+    12us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:28.703393 (+   327us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.703400 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.704436 (+  1036us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.704440 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.704955 (+   515us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.704963 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.705068 (+   105us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.705104 (+    36us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.705523 (+   419us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.705529 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.705683 (+   154us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.705753 (+    70us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":48}
I20260504 14:07:28.713013 27914 raft_consensus.cc:1275] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Refusing update from remote peer 33e0e159305a42cb82eabfd31e594bd3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:28.713129 28056 raft_consensus.cc:1275] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Refusing update from remote peer 33e0e159305a42cb82eabfd31e594bd3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:28.713722 28274 consensus_queue.cc:1048] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d320fa750b4a4bb38f3f56d15216be57" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 46767 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:28.713932 28297 consensus_queue.cc:1048] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97c217ef7f0e437e90bc46ff50b42155" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33467 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:28.738049 28262 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.734895 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:36079 (local address 127.25.254.195:38097)
0504 14:07:28.735020 (+   125us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.735027 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.735041 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:28.735224 (+   183us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.735229 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.735288 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.735369 (+    81us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:28.735749 (+   380us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.736506 (+   757us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.737317 (+   811us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.737517 (+   200us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.737559 (+    42us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.737649 (+    90us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.737719 (+    70us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.737875 (+   156us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.737938 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":45}
I20260504 14:07:28.738101 28319 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.734558 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:38097 (local address 127.25.254.194:36079)
0504 14:07:28.734968 (+   410us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:28.734985 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:28.735095 (+   110us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:28.735404 (+   309us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:28.735407 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:28.735417 (+    10us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:28.735639 (+   222us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.735645 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.736648 (+  1003us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.736651 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:28.737166 (+   515us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:28.737174 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.737390 (+   216us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.737442 (+    52us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:28.737762 (+   320us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:28.737769 (+     7us) client_negotiation.cc:770] Sending connection context
0504 14:07:28.737836 (+    67us) client_negotiation.cc:241] Negotiation successful
0504 14:07:28.737909 (+    73us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":203,"thread_start_us":135,"threads_started":1}
I20260504 14:07:28.769876 27755 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:28.770080 27755 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43976:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:28.772361 27755 catalog_manager.cc:5958] T 00000000000000000000000000000000 P c565044b9e4a4b8094dc715613b2df4b: Sending DeleteTablet for 3 replicas of tablet 26317babbb8e448bacaff108d452cb73
I20260504 14:07:28.772984 27894 tablet_service.cc:1558] Processing DeleteTablet for tablet 26317babbb8e448bacaff108d452cb73 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57820
I20260504 14:07:28.773182 28292 tablet_replica.cc:333] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: stopping tablet replica
I20260504 14:07:28.773172 28036 tablet_service.cc:1558] Processing DeleteTablet for tablet 26317babbb8e448bacaff108d452cb73 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:60086
I20260504 14:07:28.773313 28292 raft_consensus.cc:2243] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:28.773434 28292 raft_consensus.cc:2272] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:28.773497 28178 tablet_service.cc:1558] Processing DeleteTablet for tablet 26317babbb8e448bacaff108d452cb73 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:28 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:47418
I20260504 14:07:28.773604 28291 tablet_replica.cc:333] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: stopping tablet replica
I20260504 14:07:28.773659 28290 tablet_replica.cc:333] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: stopping tablet replica
I20260504 14:07:28.773715 28291 raft_consensus.cc:2243] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:28.773749 28290 raft_consensus.cc:2243] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:28.773818 28291 raft_consensus.cc:2272] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:28.774052 28290 raft_consensus.cc:2272] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:28.775276 28291 ts_tablet_manager.cc:1916] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.777793 28291 ts_tablet_manager.cc:1929] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:28.777858 28291 log.cc:1199] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/26317babbb8e448bacaff108d452cb73
I20260504 14:07:28.778100 28291 ts_tablet_manager.cc:1950] T 26317babbb8e448bacaff108d452cb73 P d320fa750b4a4bb38f3f56d15216be57: Deleting consensus metadata
I20260504 14:07:28.778820 28292 ts_tablet_manager.cc:1916] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.781286 28292 ts_tablet_manager.cc:1929] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:28.781354 28292 log.cc:1199] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/26317babbb8e448bacaff108d452cb73
I20260504 14:07:28.781591 28292 ts_tablet_manager.cc:1950] T 26317babbb8e448bacaff108d452cb73 P 97c217ef7f0e437e90bc46ff50b42155: Deleting consensus metadata
I20260504 14:07:28.782701 28290 ts_tablet_manager.cc:1916] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:28.785046 28290 ts_tablet_manager.cc:1929] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:28.785115 28290 log.cc:1199] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SmokeTestAsAuthorizedUser.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/26317babbb8e448bacaff108d452cb73
I20260504 14:07:28.785347 28290 ts_tablet_manager.cc:1950] T 26317babbb8e448bacaff108d452cb73 P 33e0e159305a42cb82eabfd31e594bd3: Deleting consensus metadata
I20260504 14:07:28.786228 27739 catalog_manager.cc:5002] TS 33e0e159305a42cb82eabfd31e594bd3 (127.25.254.195:38097): tablet 26317babbb8e448bacaff108d452cb73 (table test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]) successfully deleted
I20260504 14:07:28.788806 27740 catalog_manager.cc:5002] TS 97c217ef7f0e437e90bc46ff50b42155 (127.25.254.193:33467): tablet 26317babbb8e448bacaff108d452cb73 (table test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]) successfully deleted
I20260504 14:07:28.789522 27742 catalog_manager.cc:5002] TS d320fa750b4a4bb38f3f56d15216be57 (127.25.254.194:46767): tablet 26317babbb8e448bacaff108d452cb73 (table test-table [id=067f0e4f0c9a4107bdb19d0967e1e02e]) successfully deleted
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[27707](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903648, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:28.797386 28265 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.788033 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57836 (local address 127.25.254.193:33467)
0504 14:07:28.788176 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.788182 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.788761 (+   579us) server_negotiation.cc:408] Connection header received
0504 14:07:28.788957 (+   196us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.788960 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.789018 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.789068 (+    50us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.790246 (+  1178us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.790751 (+   505us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.791439 (+   688us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.791652 (+   213us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.793835 (+  2183us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.793863 (+    28us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.793875 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.793899 (+    24us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.795675 (+  1776us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.796121 (+   446us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.796126 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.796127 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.796174 (+    47us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.796415 (+   241us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.796418 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.796420 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.796596 (+   176us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.796792 (+   196us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.796989 (+   197us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.797114 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
W20260504 14:07:28.797821 27954 server_base.cc:1143] Unauthorized access attempt to method kudu.server.GenericService.SetFlag from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:57836
I20260504 14:07:28.806866 27804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:28.799651 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:43986 (local address 127.25.254.254:43237)
0504 14:07:28.799794 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:28.799798 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:28.799953 (+   155us) server_negotiation.cc:408] Connection header received
0504 14:07:28.800149 (+   196us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:28.800152 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:28.800200 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:28.800295 (+    95us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:28.801035 (+   740us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.801533 (+   498us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:28.802300 (+   767us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:28.802891 (+   591us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:28.803600 (+   709us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:28.803622 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:28.803633 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:28.803660 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:28.805070 (+  1410us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.805542 (+   472us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.805548 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.805550 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.805605 (+    55us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:28.805956 (+   351us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:28.805960 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:28.805962 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:28.806107 (+   145us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:28.806309 (+   202us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:28.806552 (+   243us) server_negotiation.cc:300] Negotiation successful
0504 14:07:28.806671 (+   119us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
W20260504 14:07:28.807260 27755 server_base.cc:1143] Unauthorized access attempt to method kudu.master.MasterService.TSHeartbeat from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:43986
I20260504 14:07:28.808650 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27829
I20260504 14:07:28.816342 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27971
I20260504 14:07:28.825103 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 28113
I20260504 14:07:28.831776 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 27723
2026-05-04T14:07:28Z chronyd exiting
[       OK ] SecurityITest.SmokeTestAsAuthorizedUser (3711 ms)
[ RUN      ] SecurityITest.TxnSmokeWithDifferentUserTypes
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[28329](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[28329](info): set up 2 sockets
May 04 14:07:28 dist-test-slave-2x32 krb5kdc[28329](info): commencing operation
krb5kdc: starting...
W20260504 14:07:30.872547 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.016s	user 0.000s	sys 0.007s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:30 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903650, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:30Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:30Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:31.023360 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44627
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39763
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:44627
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--txn_manager_enabled=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:31.131397 28345 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:31.131743 28345 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:31.131858 28345 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:31.135489 28345 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:31.135617 28345 flags.cc:432] Enabled experimental flag: --txn_manager_enabled=true
W20260504 14:07:31.135654 28345 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:31.135708 28345 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:31.135769 28345 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:31.135824 28345 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:31.140637 28345 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39763
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:44627
--txn_manager_enabled=true
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44627
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.28345
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:31.141878 28345 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:31.142820 28345 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:31.148869 28353 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.148886 28350 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.148882 28351 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:31.149106 28345 server_base.cc:1061] running on GCE node
I20260504 14:07:31.149748 28345 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:31.150914 28345 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:31.152128 28345 hybrid_clock.cc:648] HybridClock initialized: now 1777903651152088 us; error 57 us; skew 500 ppm
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:31.155292 28345 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:31.156527 28345 webserver.cc:492] Webserver started at http://127.25.254.254:43241/ using document root <none> and password file <none>
I20260504 14:07:31.157119 28345 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:31.157196 28345 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:31.157423 28345 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:31.159214 28345 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "4fbcd8298649420c921777c7ea53a991"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "69cf20b0edfeface3b50ff25d087c740"
server_key_iv: "4d79521319fc1ea199baae340ffdda15"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.159730 28345 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "4fbcd8298649420c921777c7ea53a991"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "69cf20b0edfeface3b50ff25d087c740"
server_key_iv: "4d79521319fc1ea199baae340ffdda15"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.163220 28345 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.000s	sys 0.004s
I20260504 14:07:31.165716 28360 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.166894 28345 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:31.167058 28345 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "4fbcd8298649420c921777c7ea53a991"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "69cf20b0edfeface3b50ff25d087c740"
server_key_iv: "4d79521319fc1ea199baae340ffdda15"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.167186 28345 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:31.178898 28345 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:31.182126 28345 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:31.182458 28345 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:31.191432 28345 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:44627
I20260504 14:07:31.191431 28422 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:44627 every 8 connection(s)
I20260504 14:07:31.192555 28345 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:31.196003 28423 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:31.199523 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 28345
I20260504 14:07:31.199648 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:31.199900 26619 external_mini_cluster.cc:1468] Setting key 43e50a9ac7d4d0e4117ad50ffaaded6a
I20260504 14:07:31.201712 28423 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Bootstrap starting.
I20260504 14:07:31.203919 28423 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:31.204658 28423 log.cc:826] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:31.206748 28423 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: No bootstrap required, opened a new log
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903650, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:31.209970 28423 raft_consensus.cc:359] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } }
I20260504 14:07:31.210211 28423 raft_consensus.cc:385] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:31.210302 28423 raft_consensus.cc:740] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4fbcd8298649420c921777c7ea53a991, State: Initialized, Role: FOLLOWER
I20260504 14:07:31.210850 28423 consensus_queue.cc:260] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } }
I20260504 14:07:31.211025 28423 raft_consensus.cc:399] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:31.211098 28423 raft_consensus.cc:493] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:31.211201 28423 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:31.212292 28423 raft_consensus.cc:515] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } }
I20260504 14:07:31.212689 28423 leader_election.cc:304] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4fbcd8298649420c921777c7ea53a991; no voters: 
I20260504 14:07:31.213001 28423 leader_election.cc:290] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:31.213116 28428 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:31.213338 28428 raft_consensus.cc:697] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [term 1 LEADER]: Becoming Leader. State: Replica: 4fbcd8298649420c921777c7ea53a991, State: Running, Role: LEADER
I20260504 14:07:31.213631 28428 consensus_queue.cc:237] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } }
I20260504 14:07:31.213750 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.201316 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50006 (local address 127.25.254.254:44627)
0504 14:07:31.201808 (+   492us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.201818 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.201882 (+    64us) server_negotiation.cc:408] Connection header received
0504 14:07:31.202460 (+   578us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.202477 (+    17us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.202763 (+   286us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.203078 (+   315us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.203959 (+   881us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.204761 (+   802us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.205460 (+   699us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.205778 (+   318us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.208278 (+  2500us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.208298 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.208310 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.208339 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.210801 (+  2462us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.211307 (+   506us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.211313 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.211319 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.211406 (+    87us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.211681 (+   275us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.211683 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.211684 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.211981 (+   297us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.212137 (+   156us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.212499 (+   362us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.212805 (+   306us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":352,"thread_start_us":158,"threads_started":1}
I20260504 14:07:31.214430 28423 sys_catalog.cc:565] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:31.215178 28429 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 4fbcd8298649420c921777c7ea53a991. Latest consensus state: current_term: 1 leader_uuid: "4fbcd8298649420c921777c7ea53a991" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } } }
I20260504 14:07:31.215308 28429 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:31.215632 28430 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "4fbcd8298649420c921777c7ea53a991" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4fbcd8298649420c921777c7ea53a991" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44627 } } }
I20260504 14:07:31.215718 28430 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:31.215997 28435 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:31.219091 28435 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:31.224334 28435 catalog_manager.cc:1357] Generated new cluster ID: 67d6e9fa0cf34c948133056bfb82763b
I20260504 14:07:31.224427 28435 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:31.244998 28435 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:31.246054 28435 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:31.255275 28435 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Generated new TSK 0
I20260504 14:07:31.256268 28435 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:31.325834 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44627
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39763
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true
--txn_keepalive_interval_ms=500 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:31.430778 28451 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:31.431061 28451 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:31.431162 28451 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:31.434691 28451 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:31.434795 28451 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:31.434885 28451 flags.cc:432] Enabled experimental flag: --txn_keepalive_interval_ms=500
W20260504 14:07:31.434930 28451 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:31.435010 28451 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:31.439549 28451 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39763
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--txn_keepalive_interval_ms=500
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:44627
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.28451
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:31.440716 28451 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:31.441599 28451 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:31.448323 28457 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.448346 28456 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:31.448443 28451 server_base.cc:1061] running on GCE node
W20260504 14:07:31.448346 28459 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:31.448947 28451 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:31.449465 28451 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:31.450614 28451 hybrid_clock.cc:648] HybridClock initialized: now 1777903651450591 us; error 39 us; skew 500 ppm
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:31.453356 28451 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:31.454593 28451 webserver.cc:492] Webserver started at http://127.25.254.193:45537/ using document root <none> and password file <none>
I20260504 14:07:31.455181 28451 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:31.455257 28451 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:31.455474 28451 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:31.457237 28451 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "c49d671030a6161737698eba843a9ca5"
server_key_iv: "b911437dda404aaae2c8fda39018fbc7"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.457728 28451 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "c49d671030a6161737698eba843a9ca5"
server_key_iv: "b911437dda404aaae2c8fda39018fbc7"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.461107 28451 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:07:31.463490 28466 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.464510 28451 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:31.464766 28451 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "c49d671030a6161737698eba843a9ca5"
server_key_iv: "b911437dda404aaae2c8fda39018fbc7"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.464874 28451 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:31.478408 28451 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:31.481585 28451 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:31.481845 28451 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:31.483422 28451 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:31.483505 28451 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.483567 28451 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:31.483618 28451 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:31.496066 28451 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:42015
I20260504 14:07:31.496124 28581 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:42015 every 8 connection(s)
I20260504 14:07:31.497136 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.484149 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:39487 (local address 127.25.254.254:44627)
0504 14:07:31.484270 (+   121us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.484273 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.485115 (+   842us) server_negotiation.cc:408] Connection header received
0504 14:07:31.485977 (+   862us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.485982 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.486044 (+    62us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.486149 (+   105us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.487641 (+  1492us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.488521 (+   880us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.489566 (+  1045us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.489827 (+   261us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.492873 (+  3046us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.492899 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.492902 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.492941 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.494882 (+  1941us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.495407 (+   525us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.495412 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.495413 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.495479 (+    66us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.495941 (+   462us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.495945 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.495947 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.496186 (+   239us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.496294 (+   108us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.496892 (+   598us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.497012 (+   120us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":32}
I20260504 14:07:31.497287 28451 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:31.498032 28475 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.484495 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.193:39487)
0504 14:07:31.484969 (+   474us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.485003 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.485775 (+   772us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.486348 (+   573us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.486357 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.486980 (+   623us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.487446 (+   466us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.487458 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.488708 (+  1250us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.488712 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.489375 (+   663us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.489385 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.489626 (+   241us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.490414 (+   788us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.490445 (+    31us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.492688 (+  2243us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.495029 (+  2341us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.495034 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.495044 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.495286 (+   242us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.495701 (+   415us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.495705 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.495707 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.495824 (+   117us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.496295 (+   471us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.496302 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.496608 (+   306us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.496849 (+   241us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.497100 (+   251us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":305,"thread_start_us":150,"threads_started":1}
I20260504 14:07:31.499738 28582 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44627
I20260504 14:07:31.499931 28582 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:31.500438 28582 heartbeater.cc:507] Master 127.25.254.254:44627 requested a full tablet report, sending...
I20260504 14:07:31.502359 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 28451
I20260504 14:07:31.502313 28375 ts_manager.cc:194] Registered new tserver with Master: d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015)
I20260504 14:07:31.502496 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:31.502772 26619 external_mini_cluster.cc:1468] Setting key eeb74d3a1a8c3c3d1d43a490ae10b68f
I20260504 14:07:31.504050 28375 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:39487
I20260504 14:07:31.510003 28588 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.502217 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.193:38833)
0504 14:07:31.502655 (+   438us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.502667 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.502809 (+   142us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.503069 (+   260us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.503073 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.503316 (+   243us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.503594 (+   278us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.503601 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.504509 (+   908us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.504513 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.505066 (+   553us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.505075 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.505241 (+   166us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.505835 (+   594us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.505850 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.506353 (+   503us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.508550 (+  2197us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.508553 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.508554 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.508757 (+   203us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.509030 (+   273us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.509033 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.509035 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.509083 (+    48us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.509542 (+   459us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.509554 (+    12us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.509639 (+    85us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.509744 (+   105us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.509856 (+   112us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":367,"thread_start_us":202,"threads_started":1}
I20260504 14:07:31.510243 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.502282 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:38833 (local address 127.25.254.254:44627)
0504 14:07:31.502413 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.502420 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.502749 (+   329us) server_negotiation.cc:408] Connection header received
0504 14:07:31.502912 (+   163us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.502918 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.502964 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.503053 (+    89us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.503715 (+   662us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.504377 (+   662us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.505221 (+   844us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.505473 (+   252us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.506497 (+  1024us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.506514 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.506517 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.506548 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.508426 (+  1878us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.508849 (+   423us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.508855 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.508860 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.508923 (+    63us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.509181 (+   258us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.509188 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.509191 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.509398 (+   207us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.509509 (+   111us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.509840 (+   331us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.509980 (+   140us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":48}
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:31.561892 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44627
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39763
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true
--txn_keepalive_interval_ms=500 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:31.670785 28593 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:31.671033 28593 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:31.671165 28593 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:31.675187 28593 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:31.675263 28593 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:31.675374 28593 flags.cc:432] Enabled experimental flag: --txn_keepalive_interval_ms=500
W20260504 14:07:31.675419 28593 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:31.675467 28593 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:31.680382 28593 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39763
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--txn_keepalive_interval_ms=500
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:44627
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.28593
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:31.681504 28593 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:31.682461 28593 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:31.688966 28601 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.689083 28598 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.688966 28599 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:31.689386 28593 server_base.cc:1061] running on GCE node
I20260504 14:07:31.689765 28593 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:31.690382 28593 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:31.691570 28593 hybrid_clock.cc:648] HybridClock initialized: now 1777903651691535 us; error 51 us; skew 500 ppm
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:31.694527 28593 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:31.695791 28593 webserver.cc:492] Webserver started at http://127.25.254.194:45399/ using document root <none> and password file <none>
I20260504 14:07:31.696382 28593 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:31.696461 28593 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:31.696673 28593 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:31.698480 28593 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "a8916c18d74888f96168085a833101d0"
server_key_iv: "f4e73d5be5e22e5068323d84b7c487e6"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.698979 28593 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "a8916c18d74888f96168085a833101d0"
server_key_iv: "f4e73d5be5e22e5068323d84b7c487e6"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.702525 28593 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:31.704808 28608 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.705860 28593 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:31.705994 28593 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "a8916c18d74888f96168085a833101d0"
server_key_iv: "f4e73d5be5e22e5068323d84b7c487e6"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.706103 28593 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:31.714648 28593 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:31.717579 28593 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:31.717794 28593 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:31.719273 28593 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:31.719323 28593 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.719386 28593 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:31.719419 28593 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:31.730770 28593 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:35029
I20260504 14:07:31.730772 28723 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:35029 every 8 connection(s)
I20260504 14:07:31.732056 28593 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:31.733448 28617 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.720355 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.194:35759)
0504 14:07:31.720824 (+   469us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.720856 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.721606 (+   750us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.722146 (+   540us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.722188 (+    42us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.722906 (+   718us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.723645 (+   739us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.723665 (+    20us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.724928 (+  1263us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.724934 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.725598 (+   664us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.725609 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.725840 (+   231us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.726574 (+   734us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.726602 (+    28us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.728673 (+  2071us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.730970 (+  2297us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.730976 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.730988 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.731278 (+   290us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.731569 (+   291us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.731578 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.731580 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.731695 (+   115us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.732047 (+   352us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.732053 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.732315 (+   262us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.732550 (+   235us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.732845 (+   295us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":311,"thread_start_us":138,"threads_started":1}
I20260504 14:07:31.734090 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.720000 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:35759 (local address 127.25.254.254:44627)
0504 14:07:31.720132 (+   132us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.720135 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.720951 (+   816us) server_negotiation.cc:408] Connection header received
0504 14:07:31.721810 (+   859us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.721814 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.721873 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.721980 (+   107us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.723880 (+  1900us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.724724 (+   844us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.725753 (+  1029us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.725984 (+   231us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.728897 (+  2913us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.728931 (+    34us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.728936 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.728975 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.730815 (+  1840us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.731382 (+   567us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.731385 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.731386 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.731466 (+    80us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.731793 (+   327us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.731795 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.731797 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.731940 (+   143us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.732015 (+    75us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.733857 (+  1842us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.733978 (+   121us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":45}
I20260504 14:07:31.734985 28724 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44627
I20260504 14:07:31.735141 28724 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:31.735728 28724 heartbeater.cc:507] Master 127.25.254.254:44627 requested a full tablet report, sending...
I20260504 14:07:31.737030 28375 ts_manager.cc:194] Registered new tserver with Master: bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:31.737846 28375 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:35759
I20260504 14:07:31.737905 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 28593
I20260504 14:07:31.738027 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:31.738333 26619 external_mini_cluster.cc:1468] Setting key 82bb4632fd62a2d34b422270a91b2bfa
I20260504 14:07:31.746085 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.737511 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:47571 (local address 127.25.254.254:44627)
0504 14:07:31.737659 (+   148us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.737663 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.737851 (+   188us) server_negotiation.cc:408] Connection header received
0504 14:07:31.738014 (+   163us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.738019 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.738074 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.738130 (+    56us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.740380 (+  2250us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.740877 (+   497us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.741656 (+   779us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.741880 (+   224us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.742779 (+   899us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.742801 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.742808 (+     7us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.742841 (+    33us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.744447 (+  1606us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.744969 (+   522us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.744972 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.744973 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.745021 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.745285 (+   264us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.745287 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.745289 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.745479 (+   190us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.745559 (+    80us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.745858 (+   299us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.745962 (+   104us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
I20260504 14:07:31.746229 28730 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.737445 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.194:47571)
0504 14:07:31.737757 (+   312us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.737773 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.737879 (+   106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.738298 (+   419us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.738301 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.739950 (+  1649us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.740244 (+   294us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.740256 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.740999 (+   743us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.741006 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.741537 (+   531us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.741545 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.741637 (+    92us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.742265 (+   628us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.742279 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.742630 (+   351us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.744575 (+  1945us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.744582 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.744587 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.744842 (+   255us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.745123 (+   281us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.745127 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.745129 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.745193 (+    64us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.745582 (+   389us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.745585 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.745677 (+    92us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.745802 (+   125us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.745948 (+   146us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":180,"thread_start_us":99,"threads_started":1}
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:31.793820 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44627
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39763
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true
--txn_keepalive_interval_ms=500 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:31.917575 28735 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:31.917898 28735 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:31.918012 28735 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:31.923703 28735 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:31.923775 28735 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:31.923830 28735 flags.cc:432] Enabled experimental flag: --txn_keepalive_interval_ms=500
W20260504 14:07:31.923846 28735 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:07:31.923897 28735 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:31.928474 28735 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39763
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--txn_keepalive_interval_ms=500
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:44627
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.28735
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:31.929539 28735 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:31.930413 28735 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:31.937649 28743 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.937740 28740 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:31.937706 28741 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:31.938011 28735 server_base.cc:1061] running on GCE node
I20260504 14:07:31.938478 28735 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:31.939025 28735 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:31.940224 28735 hybrid_clock.cc:648] HybridClock initialized: now 1777903651940192 us; error 44 us; skew 500 ppm
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:31.943128 28735 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:31.944296 28735 webserver.cc:492] Webserver started at http://127.25.254.195:34253/ using document root <none> and password file <none>
I20260504 14:07:31.944895 28735 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:31.944972 28735 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:31.945188 28735 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:31.947242 28735 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "8bba0484174a4c3bab6791a586ccd1b6"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "6c2acefd5c9f92bf8394b7a6c4cd3e13"
server_key_iv: "e1f7945302fb1afd79cc3efba3bffbe4"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.947762 28735 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "8bba0484174a4c3bab6791a586ccd1b6"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "6c2acefd5c9f92bf8394b7a6c4cd3e13"
server_key_iv: "e1f7945302fb1afd79cc3efba3bffbe4"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.951375 28735 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:31.953751 28750 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.954835 28735 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:31.954977 28735 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "8bba0484174a4c3bab6791a586ccd1b6"
format_stamp: "Formatted at 2026-05-04 14:07:31 on dist-test-slave-2x32"
server_key: "6c2acefd5c9f92bf8394b7a6c4cd3e13"
server_key_iv: "e1f7945302fb1afd79cc3efba3bffbe4"
server_key_version: "encryptionkey@0"
I20260504 14:07:31.955087 28735 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:31.964061 28735 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:31.967201 28735 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:31.967422 28735 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:31.968924 28735 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:31.968999 28735 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:31.969070 28735 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:31.969120 28735 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:07:31 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:31.981189 28735 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37501
I20260504 14:07:31.982131 28735 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:31.982393 28865 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37501 every 8 connection(s)
I20260504 14:07:31.982733 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.969820 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:53197 (local address 127.25.254.254:44627)
0504 14:07:31.969990 (+   170us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.969994 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.970846 (+   852us) server_negotiation.cc:408] Connection header received
0504 14:07:31.971884 (+  1038us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.971889 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.971957 (+    68us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.972054 (+    97us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.973747 (+  1693us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.974749 (+  1002us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.975495 (+   746us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.975690 (+   195us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.978768 (+  3078us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.978798 (+    30us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.978802 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.978837 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.980541 (+  1704us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.981153 (+   612us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.981156 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.981158 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.981212 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.981514 (+   302us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.981517 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.981519 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.981682 (+   163us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.981789 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.982377 (+   588us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.982503 (+   126us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":63}
I20260504 14:07:31.983109 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 28735
I20260504 14:07:31.983230 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:31.983417 28759 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.970252 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.195:53197)
0504 14:07:31.970694 (+   442us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.970734 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.971632 (+   898us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.972220 (+   588us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.972228 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.972919 (+   691us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.973583 (+   664us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.973596 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.974887 (+  1291us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.974891 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.975350 (+   459us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.975357 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.975599 (+   242us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.976267 (+   668us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.976293 (+    26us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.978454 (+  2161us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.980704 (+  2250us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.980713 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.980726 (+    13us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.981038 (+   312us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.981306 (+   268us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.981309 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.981312 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.981409 (+    97us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.981805 (+   396us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.981812 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.982023 (+   211us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.982324 (+   301us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.982625 (+   301us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":261,"thread_start_us":139,"threads_started":1}
I20260504 14:07:31.983507 26619 external_mini_cluster.cc:1468] Setting key 4600e4d776b5b895a9be9d8ceee71439
I20260504 14:07:31.984851 28866 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44627
I20260504 14:07:31.985025 28866 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:31.985620 28866 heartbeater.cc:507] Master 127.25.254.254:44627 requested a full tablet report, sending...
I20260504 14:07:31.986723 28375 ts_manager.cc:194] Registered new tserver with Master: 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501)
I20260504 14:07:31.987412 28375 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:53197
I20260504 14:07:31.993360 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.987091 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:60635 (local address 127.25.254.254:44627)
0504 14:07:31.987702 (+   611us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:31.987706 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:31.987718 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:07:31.987751 (+    33us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:31.987754 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:31.987794 (+    40us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:31.987881 (+    87us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:31.988528 (+   647us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.989018 (+   490us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.989631 (+   613us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.989774 (+   143us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.990579 (+   805us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:31.990594 (+    15us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:31.990596 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:31.990618 (+    22us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:31.991998 (+  1380us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.992341 (+   343us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.992343 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.992345 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.992383 (+    38us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:31.992677 (+   294us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:31.992679 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:31.992680 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:31.992805 (+   125us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:31.992878 (+    73us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:31.993160 (+   282us) server_negotiation.cc:300] Negotiation successful
0504 14:07:31.993246 (+    86us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":510}
I20260504 14:07:31.993474 28872 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:31.987009 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.25.254.195:60635)
0504 14:07:31.987280 (+   271us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:31.987292 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:31.987368 (+    76us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:31.987929 (+   561us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:31.987932 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:31.988137 (+   205us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:31.988404 (+   267us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.988413 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.989131 (+   718us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:31.989135 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:31.989528 (+   393us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:31.989535 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:31.989626 (+    91us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:31.990111 (+   485us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:31.990124 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:31.990465 (+   341us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:31.992097 (+  1632us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.992100 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.992102 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.992255 (+   153us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.992499 (+   244us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:31.992502 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:31.992504 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:31.992555 (+    51us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:31.992923 (+   368us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:31.992926 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:31.993012 (+    86us) client_negotiation.cc:770] Sending connection context
0504 14:07:31.993109 (+    97us) client_negotiation.cc:241] Negotiation successful
0504 14:07:31.993211 (+   102us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":191,"thread_start_us":102,"threads_started":1}
I20260504 14:07:31.997560 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
May 04 14:07:32 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903652, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
May 04 14:07:32 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903652, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:32.022656 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.014824 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50012 (local address 127.25.254.254:44627)
0504 14:07:32.014981 (+   157us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.014984 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.014996 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:07:32.015035 (+    39us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.015037 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.015079 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.015170 (+    91us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:32.016008 (+   838us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.016503 (+   495us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.017215 (+   712us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.017435 (+   220us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.019644 (+  2209us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:32.019670 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:32.019673 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:32.019703 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:32.020989 (+  1286us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:32.021366 (+   377us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:32.021369 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:32.021370 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:32.021418 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:32.021722 (+   304us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:32.021725 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:32.021726 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:32.021884 (+   158us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:32.022004 (+   120us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.022356 (+   352us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.022519 (+   163us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:07:32.025550 28375 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:32.028115 28375 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:32.041172 28886 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.036856 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37501 (local address 127.0.0.1:35538)
0504 14:07:32.037570 (+   714us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.037585 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.037703 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.038485 (+   782us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.038491 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.038529 (+    38us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.038852 (+   323us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.038859 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.039856 (+   997us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.039862 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.040683 (+   821us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.040691 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.040799 (+   108us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.040848 (+    49us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.040933 (+    85us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.040998 (+    65us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":633,"spinlock_wait_cycles":7808,"thread_start_us":102,"threads_started":1}
I20260504 14:07:32.041453 28883 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.037229 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35029 (local address 127.0.0.1:54334)
0504 14:07:32.037923 (+   694us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.037935 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.038016 (+    81us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.038697 (+   681us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.038699 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.038715 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.038924 (+   209us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.038931 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.040265 (+  1334us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.040268 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.041067 (+   799us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.041075 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.041183 (+   108us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.041196 (+    13us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.041234 (+    38us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.041315 (+    81us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":632,"thread_start_us":110,"threads_started":1}
I20260504 14:07:32.042323 28885 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.036584 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.0.0.1:56020)
0504 14:07:32.037500 (+   916us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.037542 (+    42us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.037701 (+   159us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.038482 (+   781us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.038488 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.038528 (+    40us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.038824 (+   296us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.038830 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.040225 (+  1395us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.040229 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.041082 (+   853us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.041089 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.041184 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.041202 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.041251 (+    49us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.041305 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":752,"spinlock_wait_cycles":7936,"thread_start_us":98,"threads_started":1}
I20260504 14:07:32.042423 28887 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.037347 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35538 (local address 127.25.254.195:37501)
0504 14:07:32.038020 (+   673us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.038026 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.038044 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:07:32.038114 (+    70us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.038118 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.038300 (+   182us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.038482 (+   182us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.038988 (+   506us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.039734 (+   746us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.041545 (+  1811us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.042065 (+   520us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.042139 (+    74us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.042221 (+    82us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.042296 (+    75us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":560,"thread_start_us":141,"threads_started":1}
I20260504 14:07:32.042435 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.036721 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:56020 (local address 127.25.254.193:42015)
0504 14:07:32.037087 (+   366us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.037093 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.037840 (+   747us) server_negotiation.cc:408] Connection header received
0504 14:07:32.037938 (+    98us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.037943 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.038131 (+   188us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.038304 (+   173us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.038956 (+   652us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.040098 (+  1142us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.041420 (+  1322us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.041967 (+   547us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.042095 (+   128us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.042211 (+   116us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.042300 (+    89us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":266,"thread_start_us":85,"threads_started":1}
I20260504 14:07:32.042465 28888 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.037393 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:54334 (local address 127.25.254.194:35029)
0504 14:07:32.037781 (+   388us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.037788 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.038103 (+   315us) server_negotiation.cc:408] Connection header received
0504 14:07:32.038220 (+   117us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.038228 (+     8us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.038395 (+   167us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.038529 (+   134us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.039043 (+   514us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.040135 (+  1092us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.041690 (+  1555us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.042047 (+   357us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.042121 (+    74us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.042226 (+   105us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.042298 (+    72us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":289,"thread_start_us":112,"threads_started":1}
I20260504 14:07:32.045095 28800 tablet_service.cc:1511] Processing CreateTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 (DEFAULT_TABLE table=test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.045120 28516 tablet_service.cc:1511] Processing CreateTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 (DEFAULT_TABLE table=test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.045120 28658 tablet_service.cc:1511] Processing CreateTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 (DEFAULT_TABLE table=test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.046249 28516 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1edba6abea4143208a22fafd73fe0ae1. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.046250 28658 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1edba6abea4143208a22fafd73fe0ae1. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.046264 28800 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1edba6abea4143208a22fafd73fe0ae1. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.051831 28889 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Bootstrap starting.
I20260504 14:07:32.053328 28890 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Bootstrap starting.
I20260504 14:07:32.054476 28889 tablet_bootstrap.cc:654] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.054481 28891 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Bootstrap starting.
I20260504 14:07:32.055418 28890 tablet_bootstrap.cc:654] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.055812 28889 log.cc:826] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:32.056173 28891 tablet_bootstrap.cc:654] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.056195 28890 log.cc:826] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:32.057363 28891 log.cc:826] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:32.057997 28890 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: No bootstrap required, opened a new log
I20260504 14:07:32.057996 28889 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: No bootstrap required, opened a new log
I20260504 14:07:32.058244 28889 ts_tablet_manager.cc:1403] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent bootstrapping tablet: real 0.007s	user 0.006s	sys 0.000s
I20260504 14:07:32.058254 28890 ts_tablet_manager.cc:1403] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent bootstrapping tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:32.059192 28891 tablet_bootstrap.cc:492] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: No bootstrap required, opened a new log
I20260504 14:07:32.059544 28891 ts_tablet_manager.cc:1403] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent bootstrapping tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:32.061709 28890 raft_consensus.cc:359] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.061949 28890 raft_consensus.cc:385] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.062016 28890 raft_consensus.cc:740] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.062372 28889 raft_consensus.cc:359] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.062582 28889 raft_consensus.cc:385] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.062639 28889 raft_consensus.cc:740] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.062523 28890 consensus_queue.cc:260] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.062602 28891 raft_consensus.cc:359] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.062769 28891 raft_consensus.cc:385] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.062826 28891 raft_consensus.cc:740] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.063064 28889 consensus_queue.cc:260] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.063208 28891 consensus_queue.cc:260] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.063511 28890 ts_tablet_manager.cc:1434] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent starting tablet: real 0.005s	user 0.005s	sys 0.000s
I20260504 14:07:32.063892 28582 heartbeater.cc:499] Master 127.25.254.254:44627 was elected leader, sending a full tablet report...
I20260504 14:07:32.063943 28889 ts_tablet_manager.cc:1434] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent starting tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:07:32.063943 28891 ts_tablet_manager.cc:1434] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent starting tablet: real 0.004s	user 0.004s	sys 0.000s
I20260504 14:07:32.064294 28724 heartbeater.cc:499] Master 127.25.254.254:44627 was elected leader, sending a full tablet report...
I20260504 14:07:32.064503 28866 heartbeater.cc:499] Master 127.25.254.254:44627 was elected leader, sending a full tablet report...
I20260504 14:07:32.146252 28895 raft_consensus.cc:493] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:32.146489 28895 raft_consensus.cc:515] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.147590 28895 leader_election.cc:290] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:32.151005 28898 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.147930 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37501 (local address 127.25.254.193:43969)
0504 14:07:32.148333 (+   403us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.148346 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.148518 (+   172us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.148827 (+   309us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.148830 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.148849 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.149100 (+   251us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.149106 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.149878 (+   772us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.149886 (+     8us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.150545 (+   659us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.150553 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.150711 (+   158us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.150729 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.150784 (+    55us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.150838 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":318,"spinlock_wait_cycles":31488,"thread_start_us":148,"threads_started":1}
I20260504 14:07:32.151065 28899 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.147930 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35029 (local address 127.25.254.193:41085)
0504 14:07:32.148340 (+   410us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.148352 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.148523 (+   171us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.148920 (+   397us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.148924 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.148946 (+    22us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.149195 (+   249us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.149201 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.149948 (+   747us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.149952 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.150654 (+   702us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.150663 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.150792 (+   129us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.150807 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.150860 (+    53us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.150911 (+    51us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":340,"mutex_wait_us":25,"spinlock_wait_cycles":14464,"thread_start_us":128,"threads_started":1}
I20260504 14:07:32.151451 28887 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.148111 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:43969 (local address 127.25.254.195:37501)
0504 14:07:32.148232 (+   121us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.148236 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.148436 (+   200us) server_negotiation.cc:408] Connection header received
0504 14:07:32.148671 (+   235us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.148674 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.148718 (+    44us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.148804 (+    86us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.149227 (+   423us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.149737 (+   510us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.150672 (+   935us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.151165 (+   493us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.151204 (+    39us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.151260 (+    56us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.151322 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":46}
I20260504 14:07:32.151556 28888 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.147991 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:41085 (local address 127.25.254.194:35029)
0504 14:07:32.148146 (+   155us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.148150 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.148448 (+   298us) server_negotiation.cc:408] Connection header received
0504 14:07:32.148642 (+   194us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.148647 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.148732 (+    85us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.148813 (+    81us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.149320 (+   507us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.149826 (+   506us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.150787 (+   961us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.151275 (+   488us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.151309 (+    34us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.151364 (+    55us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.151420 (+    56us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":66}
I20260504 14:07:32.152128 28820 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1edba6abea4143208a22fafd73fe0ae1" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6" is_pre_election: true
I20260504 14:07:32.152194 28660 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1edba6abea4143208a22fafd73fe0ae1" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" is_pre_election: true
I20260504 14:07:32.152459 28820 raft_consensus.cc:2468] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 0.
I20260504 14:07:32.152468 28660 raft_consensus.cc:2468] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 0.
I20260504 14:07:32.153024 28468 leader_election.cc:304] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: bbbce19d6ac948a1ba8dfbc4e8aebe53, d50b95c3eafb4e4ea0f3e4ec97a791ef; no voters: 
I20260504 14:07:32.153263 28895 raft_consensus.cc:2804] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:32.153333 28895 raft_consensus.cc:493] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:32.153376 28895 raft_consensus.cc:3060] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.154392 28895 raft_consensus.cc:515] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.154762 28895 leader_election.cc:290] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 election: Requested vote from peers 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:32.155088 28820 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1edba6abea4143208a22fafd73fe0ae1" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6"
I20260504 14:07:32.155140 28660 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1edba6abea4143208a22fafd73fe0ae1" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
I20260504 14:07:32.155218 28820 raft_consensus.cc:3060] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.155296 28660 raft_consensus.cc:3060] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.156327 28820 raft_consensus.cc:2468] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 1.
I20260504 14:07:32.156335 28660 raft_consensus.cc:2468] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 1.
I20260504 14:07:32.156633 28468 leader_election.cc:304] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: bbbce19d6ac948a1ba8dfbc4e8aebe53, d50b95c3eafb4e4ea0f3e4ec97a791ef; no voters: 
I20260504 14:07:32.156824 28895 raft_consensus.cc:2804] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:32.157099 28895 raft_consensus.cc:697] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 LEADER]: Becoming Leader. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Running, Role: LEADER
I20260504 14:07:32.157410 28895 consensus_queue.cc:237] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } }
I20260504 14:07:32.160996 28375 catalog_manager.cc:5671] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef reported cstate change: term changed from 0 to 1, leader changed from <none> to d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193). New cstate: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } health_report { overall_health: UNKNOWN } } }
May 04 14:07:32 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903651, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:32.220592 28907 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.212194 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44627 (local address 127.0.0.1:50026)
0504 14:07:32.212489 (+   295us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.212503 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.212616 (+   113us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.212882 (+   266us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.212884 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.213513 (+   629us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:32.213719 (+   206us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.213725 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.214688 (+   963us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.214692 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.215073 (+   381us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.215079 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.215191 (+   112us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.215890 (+   699us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:32.215904 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:32.217417 (+  1513us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:32.219171 (+  1754us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:32.219180 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:32.219190 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:32.219455 (+   265us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:32.219697 (+   242us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:32.219700 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:32.219701 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:32.219753 (+    52us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:32.220067 (+   314us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:32.220072 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:32.220168 (+    96us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.220271 (+   103us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.220394 (+   123us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":205,"thread_start_us":123,"threads_started":1}
I20260504 14:07:32.220611 28426 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.212247 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50026 (local address 127.25.254.254:44627)
0504 14:07:32.212396 (+   149us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.212401 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.212620 (+   219us) server_negotiation.cc:408] Connection header received
0504 14:07:32.212699 (+    79us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.212702 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.212745 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.212824 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:32.213855 (+  1031us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.214539 (+   684us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.215185 (+   646us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.215314 (+   129us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.217537 (+  2223us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:32.217557 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:32.217559 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:32.217580 (+    21us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:32.219057 (+  1477us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:32.219546 (+   489us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:32.219549 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:32.219550 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:32.219589 (+    39us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:32.219835 (+   246us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:32.219837 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:32.219838 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:32.219973 (+   135us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:32.220046 (+    73us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.220312 (+   266us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.220481 (+   169us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"mutex_wait_us":60,"server-negotiator.queue_time_us":59}
I20260504 14:07:32.224865 28375 catalog_manager.cc:2257] Servicing CreateTable request from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:50026:
name: "kudu_system.kudu_transactions"
schema {
  columns {
    name: "txn_id"
    type: INT64
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "entry_type"
    type: INT8
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "identifier"
    type: STRING
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "metadata"
    type: STRING
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\006\001\000\000\000\000\000\000\000\000\007\001@B\017\000\000\000\000\000"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "txn_id"
    }
  }
}
table_type: TXN_STATUS_TABLE
W20260504 14:07:32.225633 28375 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table kudu_system.kudu_transactions in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:32.231542 28516 tablet_service.cc:1511] Processing CreateTablet for tablet b209c95a32ab42489bb787e250ca60e6 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=fe0a11841648427dbf1912dc9f988d8b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:32.231613 28658 tablet_service.cc:1511] Processing CreateTablet for tablet b209c95a32ab42489bb787e250ca60e6 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=fe0a11841648427dbf1912dc9f988d8b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:32.232280 28658 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b209c95a32ab42489bb787e250ca60e6. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.232281 28516 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b209c95a32ab42489bb787e250ca60e6. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.231948 28800 tablet_service.cc:1511] Processing CreateTablet for tablet b209c95a32ab42489bb787e250ca60e6 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=fe0a11841648427dbf1912dc9f988d8b]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:07:32.232604 28800 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b209c95a32ab42489bb787e250ca60e6. 1 dirs total, 0 dirs full, 0 dirs failed
W20260504 14:07:32.234200 28725 tablet.cc:2404] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:32.235138 28891 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Bootstrap starting.
I20260504 14:07:32.235188 28889 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: Bootstrap starting.
W20260504 14:07:32.235358 28867 tablet.cc:2404] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:32.236191 28890 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Bootstrap starting.
I20260504 14:07:32.236248 28889 tablet_bootstrap.cc:654] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.236274 28891 tablet_bootstrap.cc:654] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.237362 28890 tablet_bootstrap.cc:654] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.237644 28891 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: No bootstrap required, opened a new log
I20260504 14:07:32.237743 28889 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: No bootstrap required, opened a new log
I20260504 14:07:32.237766 28891 ts_tablet_manager.cc:1403] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:32.237825 28889 ts_tablet_manager.cc:1403] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:32.238358 28891 raft_consensus.cc:359] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.238478 28891 raft_consensus.cc:385] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.238521 28891 raft_consensus.cc:740] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.238502 28889 raft_consensus.cc:359] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.238638 28889 raft_consensus.cc:385] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.238679 28889 raft_consensus.cc:740] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.238659 28891 consensus_queue.cc:260] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.238811 28889 consensus_queue.cc:260] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.238873 28890 tablet_bootstrap.cc:492] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: No bootstrap required, opened a new log
I20260504 14:07:32.238946 28890 ts_tablet_manager.cc:1403] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:32.239014 28891 ts_tablet_manager.cc:1434] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:07:32.239006 28897 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.239078 28889 ts_tablet_manager.cc:1434] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:07:32.239130 28897 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.239288 28896 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.239413 28896 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.239434 28890 raft_consensus.cc:359] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.239526 28890 raft_consensus.cc:385] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.239555 28890 raft_consensus.cc:740] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.239734 28890 consensus_queue.cc:260] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.240036 28890 ts_tablet_manager.cc:1434] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:32.240005 28895 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.241226 28895 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: This TxnStatusTablet replica's current role is: FOLLOWER
W20260504 14:07:32.248729 28583 tablet.cc:2404] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:32.286082 28895 raft_consensus.cc:493] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:32.286276 28895 raft_consensus.cc:515] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.286815 28895 leader_election.cc:290] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029), 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501)
I20260504 14:07:32.287338 28820 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b209c95a32ab42489bb787e250ca60e6" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6" is_pre_election: true
I20260504 14:07:32.287283 28660 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b209c95a32ab42489bb787e250ca60e6" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" is_pre_election: true
I20260504 14:07:32.287489 28820 raft_consensus.cc:2468] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 0.
I20260504 14:07:32.287489 28660 raft_consensus.cc:2468] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 0.
I20260504 14:07:32.287827 28468 leader_election.cc:304] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: bbbce19d6ac948a1ba8dfbc4e8aebe53, d50b95c3eafb4e4ea0f3e4ec97a791ef; no voters: 
I20260504 14:07:32.288020 28895 raft_consensus.cc:2804] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:32.288079 28895 raft_consensus.cc:493] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:32.288107 28895 raft_consensus.cc:3060] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.289055 28895 raft_consensus.cc:515] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.289533 28895 leader_election.cc:290] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 election: Requested vote from peers bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029), 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501)
I20260504 14:07:32.289858 28660 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b209c95a32ab42489bb787e250ca60e6" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
I20260504 14:07:32.289985 28660 raft_consensus.cc:3060] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.290069 28820 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b209c95a32ab42489bb787e250ca60e6" candidate_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6"
I20260504 14:07:32.290267 28820 raft_consensus.cc:3060] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.290983 28660 raft_consensus.cc:2468] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 1.
I20260504 14:07:32.291240 28820 raft_consensus.cc:2468] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d50b95c3eafb4e4ea0f3e4ec97a791ef in term 1.
I20260504 14:07:32.291352 28468 leader_election.cc:304] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: bbbce19d6ac948a1ba8dfbc4e8aebe53, d50b95c3eafb4e4ea0f3e4ec97a791ef; no voters: 
I20260504 14:07:32.291570 28895 raft_consensus.cc:2804] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:32.291671 28895 raft_consensus.cc:697] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 LEADER]: Becoming Leader. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Running, Role: LEADER
I20260504 14:07:32.291822 28895 consensus_queue.cc:237] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.292583 28900 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: TxnStatusTablet state changed. Reason: New leader d50b95c3eafb4e4ea0f3e4ec97a791ef. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.292695 28900 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:32.293138 28912 txn_status_manager.cc:874] Waiting until node catch up with all replicated operations in previous term...
I20260504 14:07:32.293357 28912 txn_status_manager.cc:930] Loading transaction status metadata into memory...
I20260504 14:07:32.293785 28375 catalog_manager.cc:5671] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef reported cstate change: term changed from 0 to 1, leader changed from <none> to d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193). New cstate: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } health_report { overall_health: UNKNOWN } } }
I20260504 14:07:32.306404 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.303041 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:56026 (local address 127.25.254.193:42015)
0504 14:07:32.303177 (+   136us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.303180 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.303193 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:07:32.303428 (+   235us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.303431 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.303508 (+    77us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.303666 (+   158us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.304009 (+   343us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.304519 (+   510us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.305249 (+   730us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.305475 (+   226us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.305523 (+    48us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.305941 (+   418us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.306060 (+   119us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.306197 (+   137us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.306243 (+    46us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
I20260504 14:07:32.306403 28907 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.302924 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.0.0.1:56026)
0504 14:07:32.303088 (+   164us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.303106 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.303224 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.303675 (+   451us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.303679 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.303688 (+     9us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:32.303892 (+   204us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.303897 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.304644 (+   747us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.304648 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.305075 (+   427us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.305081 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.305264 (+   183us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.305310 (+    46us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.306048 (+   738us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.306061 (+    13us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.306168 (+   107us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.306265 (+    97us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":37}
I20260504 14:07:32.312206 28660 raft_consensus.cc:1275] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Refusing update from remote peer d50b95c3eafb4e4ea0f3e4ec97a791ef: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.312250 28820 raft_consensus.cc:1275] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Refusing update from remote peer d50b95c3eafb4e4ea0f3e4ec97a791ef: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.312889 28900 consensus_queue.cc:1048] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Connected to new peer: Peer: permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.313062 28895 consensus_queue.cc:1048] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Connected to new peer: Peer: permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.317255 28896 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: TxnStatusTablet state changed. Reason: New leader d50b95c3eafb4e4ea0f3e4ec97a791ef. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.317394 28896 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.317761 28897 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: TxnStatusTablet state changed. Reason: New leader d50b95c3eafb4e4ea0f3e4ec97a791ef. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.317885 28897 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.319550 28900 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.319664 28900 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:32.320057 28897 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.320147 28897 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P bbbce19d6ac948a1ba8dfbc4e8aebe53: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.320582 28895 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.320693 28895 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P d50b95c3eafb4e4ea0f3e4ec97a791ef: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:07:32.321731 28896 tablet_replica.cc:442] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } } }
I20260504 14:07:32.321841 28896 tablet_replica.cc:445] T b209c95a32ab42489bb787e250ca60e6 P 8bba0484174a4c3bab6791a586ccd1b6: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:07:32.322293 28916 mvcc.cc:204] Tried to move back new op lower bound from 7282293359862931456 to 7282293359788511232. Current Snapshot: MvccSnapshot[applied={T|T < 7282293359862931456}]
I20260504 14:07:32.326258 28917 mvcc.cc:204] Tried to move back new op lower bound from 7282293359862931456 to 7282293359788511232. Current Snapshot: MvccSnapshot[applied={T|T < 7282293359862931456}]
I20260504 14:07:32.336887 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.334184 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:56032 (local address 127.25.254.193:42015)
0504 14:07:32.334342 (+   158us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.334346 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.334360 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:32.334471 (+   111us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.334481 (+    10us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.334526 (+    45us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.334608 (+    82us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.334948 (+   340us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.335435 (+   487us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.336195 (+   760us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.336380 (+   185us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.336424 (+    44us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.336493 (+    69us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.336579 (+    86us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.336706 (+   127us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.336748 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:07:32.347466 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.344683 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:47467 (local address 127.25.254.193:42015)
0504 14:07:32.344808 (+   125us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.344812 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.344936 (+   124us) server_negotiation.cc:408] Connection header received
0504 14:07:32.345092 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.345096 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.345144 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.345229 (+    85us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.345583 (+   354us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.346044 (+   461us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.346783 (+   739us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.346959 (+   176us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.347002 (+    43us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.347067 (+    65us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.347151 (+    84us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.347270 (+   119us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.347303 (+    33us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":61}
I20260504 14:07:32.347687 28926 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.344509 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.25.254.193:47467)
0504 14:07:32.344852 (+   343us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.344865 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.344995 (+   130us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.345236 (+   241us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.345239 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.345246 (+     7us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:32.345478 (+   232us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.345484 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.346171 (+   687us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.346175 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.346660 (+   485us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.346667 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.346802 (+   135us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.346843 (+    41us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.347154 (+   311us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.347164 (+    10us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.347253 (+    89us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.347308 (+    55us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":205,"thread_start_us":118,"threads_started":1}
I20260504 14:07:32.354672 28660 raft_consensus.cc:1275] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Refusing update from remote peer d50b95c3eafb4e4ea0f3e4ec97a791ef: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.354667 28820 raft_consensus.cc:1275] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Refusing update from remote peer d50b95c3eafb4e4ea0f3e4ec97a791ef: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.355155 28900 consensus_queue.cc:1048] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Connected to new peer: Peer: permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.355332 28895 consensus_queue.cc:1048] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [LEADER]: Connected to new peer: Peer: permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.401394 28376 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:32.401587 28376 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:32.404165 28376 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Sending DeleteTablet for 3 replicas of tablet 1edba6abea4143208a22fafd73fe0ae1
I20260504 14:07:32.404874 28515 tablet_service.cc:1558] Processing DeleteTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:32 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:56020
I20260504 14:07:32.404987 28658 tablet_service.cc:1558] Processing DeleteTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:32 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:54334
I20260504 14:07:32.404981 28800 tablet_service.cc:1558] Processing DeleteTablet for tablet 1edba6abea4143208a22fafd73fe0ae1 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:32 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:35538
I20260504 14:07:32.405553 28933 tablet_replica.cc:333] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: stopping tablet replica
I20260504 14:07:32.405716 28932 tablet_replica.cc:333] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: stopping tablet replica
I20260504 14:07:32.405833 28933 raft_consensus.cc:2243] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:32.406075 28933 raft_consensus.cc:2272] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:32.406231 28932 raft_consensus.cc:2243] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:32.406373 28375 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260504 14:07:32.406793 28932 raft_consensus.cc:2272] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus is shut down!
W20260504 14:07:32.406845 28375 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:32.407686 28933 ts_tablet_manager.cc:1916] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:32.408244 28932 ts_tablet_manager.cc:1916] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:32.410641 28933 ts_tablet_manager.cc:1929] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:32.410733 28933 log.cc:1199] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/1edba6abea4143208a22fafd73fe0ae1
I20260504 14:07:32.410974 28931 tablet_replica.cc:333] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: stopping tablet replica
I20260504 14:07:32.411036 28933 ts_tablet_manager.cc:1950] T 1edba6abea4143208a22fafd73fe0ae1 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting consensus metadata
I20260504 14:07:32.411325 28931 raft_consensus.cc:2243] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:32.411733 28931 raft_consensus.cc:2272] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:32.412187 28932 ts_tablet_manager.cc:1929] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:32.412307 28932 log.cc:1199] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/1edba6abea4143208a22fafd73fe0ae1
I20260504 14:07:32.412652 28932 ts_tablet_manager.cc:1950] T 1edba6abea4143208a22fafd73fe0ae1 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting consensus metadata
I20260504 14:07:32.412964 28658 tablet_service.cc:1511] Processing CreateTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda (DEFAULT_TABLE table=test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.413050 28361 catalog_manager.cc:5002] TS 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501): tablet 1edba6abea4143208a22fafd73fe0ae1 (table test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]) successfully deleted
I20260504 14:07:32.413233 28658 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa36ddefc9054c8c94d5ec702e0eccda. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.413867 28515 tablet_service.cc:1511] Processing CreateTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda (DEFAULT_TABLE table=test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.414194 28515 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa36ddefc9054c8c94d5ec702e0eccda. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.415020 28931 ts_tablet_manager.cc:1916] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:32.416662 28890 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Bootstrap starting.
I20260504 14:07:32.417085 28800 tablet_service.cc:1511] Processing CreateTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda (DEFAULT_TABLE table=test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:32.417333 28800 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa36ddefc9054c8c94d5ec702e0eccda. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:32.417752 28890 tablet_bootstrap.cc:654] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.418671 28362 catalog_manager.cc:5002] TS bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029): tablet 1edba6abea4143208a22fafd73fe0ae1 (table test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]) successfully deleted
I20260504 14:07:32.418926 28931 ts_tablet_manager.cc:1929] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:32.419008 28931 log.cc:1199] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/1edba6abea4143208a22fafd73fe0ae1
I20260504 14:07:32.419116 28890 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: No bootstrap required, opened a new log
I20260504 14:07:32.419199 28890 ts_tablet_manager.cc:1403] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:32.419314 28931 ts_tablet_manager.cc:1950] T 1edba6abea4143208a22fafd73fe0ae1 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting consensus metadata
I20260504 14:07:32.419694 28890 raft_consensus.cc:359] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.419804 28890 raft_consensus.cc:385] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.419838 28890 raft_consensus.cc:740] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.419966 28890 consensus_queue.cc:260] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.420229 28890 ts_tablet_manager.cc:1434] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260504 14:07:32.420975 28889 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Bootstrap starting.
I20260504 14:07:32.421860 28889 tablet_bootstrap.cc:654] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.422717 28364 catalog_manager.cc:5002] TS d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015): tablet 1edba6abea4143208a22fafd73fe0ae1 (table test-table [id=bc68e4af9cf944aa84c1cfdb4ea41da1]) successfully deleted
I20260504 14:07:32.423255 28889 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: No bootstrap required, opened a new log
I20260504 14:07:32.423338 28889 ts_tablet_manager.cc:1403] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:32.423830 28889 raft_consensus.cc:359] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.423933 28889 raft_consensus.cc:385] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.423966 28889 raft_consensus.cc:740] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.424079 28889 consensus_queue.cc:260] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.424327 28889 ts_tablet_manager.cc:1434] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:32.424970 28891 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Bootstrap starting.
I20260504 14:07:32.426101 28891 tablet_bootstrap.cc:654] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:32.427680 28891 tablet_bootstrap.cc:492] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: No bootstrap required, opened a new log
I20260504 14:07:32.427778 28891 ts_tablet_manager.cc:1403] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:32.428239 28891 raft_consensus.cc:359] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.428361 28891 raft_consensus.cc:385] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:32.428417 28891 raft_consensus.cc:740] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Initialized, Role: FOLLOWER
I20260504 14:07:32.428551 28891 consensus_queue.cc:260] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.428926 28891 ts_tablet_manager.cc:1434] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:32.594100 28897 raft_consensus.cc:493] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:32.594305 28897 raft_consensus.cc:515] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.595641 28897 leader_election.cc:290] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501)
I20260504 14:07:32.599315 28937 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.595907 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37501 (local address 127.25.254.194:50281)
0504 14:07:32.596344 (+   437us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.596362 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.596541 (+   179us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.596840 (+   299us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.596844 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.596863 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.597164 (+   301us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.597172 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.597963 (+   791us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.597968 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.598880 (+   912us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.598888 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.599010 (+   122us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.599031 (+    21us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.599086 (+    55us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.599152 (+    66us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":359,"thread_start_us":91,"threads_started":1}
I20260504 14:07:32.599699 28887 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.596101 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:50281 (local address 127.25.254.195:37501)
0504 14:07:32.596272 (+   171us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.596277 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.596462 (+   185us) server_negotiation.cc:408] Connection header received
0504 14:07:32.596641 (+   179us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.596644 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.596727 (+    83us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.596807 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.597300 (+   493us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.597843 (+   543us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.599016 (+  1173us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.599437 (+   421us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.599473 (+    36us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.599526 (+    53us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.599577 (+    51us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":87}
I20260504 14:07:32.600134 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.596024 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:45883 (local address 127.25.254.193:42015)
0504 14:07:32.596162 (+   138us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.596167 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.596547 (+   380us) server_negotiation.cc:408] Connection header received
0504 14:07:32.596757 (+   210us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.596763 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.596822 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.596907 (+    85us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:32.597382 (+   475us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.598123 (+   741us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.599177 (+  1054us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.599697 (+   520us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.599731 (+    34us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.599935 (+   204us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.600026 (+    91us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:07:32.600150 28819 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa36ddefc9054c8c94d5ec702e0eccda" candidate_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6" is_pre_election: true
I20260504 14:07:32.600297 28819 raft_consensus.cc:2468] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bbbce19d6ac948a1ba8dfbc4e8aebe53 in term 0.
I20260504 14:07:32.600493 28938 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.595907 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.25.254.194:45883)
0504 14:07:32.596434 (+   527us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.596448 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.596575 (+   127us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.596939 (+   364us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.596942 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.596960 (+    18us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:32.597185 (+   225us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.597191 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.598257 (+  1066us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.598260 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.599043 (+   783us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.599050 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.599788 (+   738us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.599800 (+    12us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.600283 (+   483us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.600349 (+    66us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":452,"mutex_wait_us":254,"thread_start_us":91,"threads_started":1}
I20260504 14:07:32.600764 28609 leader_election.cc:304] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, bbbce19d6ac948a1ba8dfbc4e8aebe53; no voters: 
I20260504 14:07:32.601082 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa36ddefc9054c8c94d5ec702e0eccda" candidate_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" is_pre_election: true
I20260504 14:07:32.601269 28536 raft_consensus.cc:2468] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bbbce19d6ac948a1ba8dfbc4e8aebe53 in term 0.
I20260504 14:07:32.601500 28897 raft_consensus.cc:2804] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:32.601552 28897 raft_consensus.cc:493] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:32.601585 28897 raft_consensus.cc:3060] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.602818 28897 raft_consensus.cc:515] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.603230 28897 leader_election.cc:290] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [CANDIDATE]: Term 1 election: Requested vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501)
I20260504 14:07:32.603601 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa36ddefc9054c8c94d5ec702e0eccda" candidate_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
I20260504 14:07:32.603624 28819 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa36ddefc9054c8c94d5ec702e0eccda" candidate_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8bba0484174a4c3bab6791a586ccd1b6"
I20260504 14:07:32.603730 28536 raft_consensus.cc:3060] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.603755 28819 raft_consensus.cc:3060] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:32.604645 28819 raft_consensus.cc:2468] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bbbce19d6ac948a1ba8dfbc4e8aebe53 in term 1.
I20260504 14:07:32.604800 28536 raft_consensus.cc:2468] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bbbce19d6ac948a1ba8dfbc4e8aebe53 in term 1.
I20260504 14:07:32.604983 28609 leader_election.cc:304] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, bbbce19d6ac948a1ba8dfbc4e8aebe53; no voters: 
I20260504 14:07:32.605165 28897 raft_consensus.cc:2804] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:32.605392 28897 raft_consensus.cc:697] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 LEADER]: Becoming Leader. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Running, Role: LEADER
I20260504 14:07:32.605636 28897 consensus_queue.cc:237] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:32.608266 28377 catalog_manager.cc:5671] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 reported cstate change: term changed from 0 to 1, leader changed from <none> to bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194). New cstate: current_term: 1 leader_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } health_report { overall_health: UNKNOWN } } }
I20260504 14:07:32.666525 28941 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.663404 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:54340 (local address 127.25.254.194:35029)
0504 14:07:32.663640 (+   236us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.663644 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.663668 (+    24us) server_negotiation.cc:408] Connection header received
0504 14:07:32.663756 (+    88us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.663758 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.663808 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.663925 (+   117us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.664250 (+   325us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.664788 (+   538us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.665420 (+   632us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.665617 (+   197us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.665675 (+    58us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.666046 (+   371us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.666171 (+   125us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.666276 (+   105us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.666334 (+    58us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":170,"thread_start_us":74,"threads_started":1}
I20260504 14:07:32.676441 28884 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.673637 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:53023 (local address 127.25.254.193:42015)
0504 14:07:32.673774 (+   137us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.673778 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.673905 (+   127us) server_negotiation.cc:408] Connection header received
0504 14:07:32.674015 (+   110us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.674017 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.674064 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.674134 (+    70us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.674643 (+   509us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.675102 (+   459us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.675782 (+   680us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.675923 (+   141us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.675961 (+    38us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.676034 (+    73us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.676105 (+    71us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.676293 (+   188us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.676336 (+    43us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":49}
I20260504 14:07:32.676482 28944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.673530 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.25.254.194:53023)
0504 14:07:32.673830 (+   300us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.673842 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.673931 (+    89us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.674253 (+   322us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.674256 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.674268 (+    12us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:32.674491 (+   223us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.674497 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.675224 (+   727us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.675226 (+     2us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.675657 (+   431us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.675664 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.675757 (+    93us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.675789 (+    32us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.676158 (+   369us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.676165 (+     7us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.676241 (+    76us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.676299 (+    58us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":197,"thread_start_us":106,"threads_started":1}
I20260504 14:07:32.683461 28536 raft_consensus.cc:1275] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Refusing update from remote peer bbbce19d6ac948a1ba8dfbc4e8aebe53: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.683949 28819 raft_consensus.cc:1275] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Refusing update from remote peer bbbce19d6ac948a1ba8dfbc4e8aebe53: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:32.684042 28897 consensus_queue.cc:1048] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.684438 28939 consensus_queue.cc:1048] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [LEADER]: Connected to new peer: Peer: permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:32.690135 28915 mvcc.cc:204] Tried to move back new op lower bound from 7282293361385738240 to 7282293361075539968. Current Snapshot: MvccSnapshot[applied={T|T < 7282293361385738240}]
I20260504 14:07:32.704329 28941 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.701175 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:54613 (local address 127.25.254.194:35029)
0504 14:07:32.701439 (+   264us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:32.701445 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:32.701459 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:32.701502 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:32.701505 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:32.701557 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:32.701645 (+    88us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:32.702077 (+   432us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.702733 (+   656us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.703559 (+   826us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.703776 (+   217us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.703826 (+    50us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.703908 (+    82us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.703971 (+    63us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:32.704158 (+   187us) server_negotiation.cc:300] Negotiation successful
0504 14:07:32.704208 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":129}
I20260504 14:07:32.704355 28926 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:32.700928 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35029 (local address 127.25.254.193:54613)
0504 14:07:32.701100 (+   172us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:32.701116 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:32.701200 (+    84us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:32.701659 (+   459us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:32.701662 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:32.701670 (+     8us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:32.701925 (+   255us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.701935 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.702857 (+   922us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:32.702864 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:32.703405 (+   541us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:32.703414 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:32.703550 (+   136us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:32.703577 (+    27us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:32.704047 (+   470us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:32.704053 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:07:32.704131 (+    78us) client_negotiation.cc:241] Negotiation successful
0504 14:07:32.704225 (+    94us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":48}
W20260504 14:07:32.705327 28916 tablet_replica.cc:1307] Aborted: operation has been aborted: cancelling pending write operations
I20260504 14:07:34.280987 28375 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:34.281145 28375 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:50012:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:34.284014 28375 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Sending DeleteTablet for 3 replicas of tablet fa36ddefc9054c8c94d5ec702e0eccda
I20260504 14:07:34.284701 28515 tablet_service.cc:1558] Processing DeleteTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:56020
I20260504 14:07:34.284798 28800 tablet_service.cc:1558] Processing DeleteTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:35538
I20260504 14:07:34.285157 28970 tablet_replica.cc:333] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: stopping tablet replica
I20260504 14:07:34.285143 28658 tablet_service.cc:1558] Processing DeleteTablet for tablet fa36ddefc9054c8c94d5ec702e0eccda with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:54334
I20260504 14:07:34.285284 28970 raft_consensus.cc:2243] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:34.285456 28970 raft_consensus.cc:2272] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.285530 28972 tablet_replica.cc:333] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: stopping tablet replica
I20260504 14:07:34.285679 28972 raft_consensus.cc:2243] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:34.285698 28971 tablet_replica.cc:333] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: stopping tablet replica
I20260504 14:07:34.285802 28971 raft_consensus.cc:2243] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:34.286024 28972 raft_consensus.cc:2272] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.286144 28971 raft_consensus.cc:2272] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.286773 28972 ts_tablet_manager.cc:1916] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:34.286855 28971 ts_tablet_manager.cc:1916] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:34.288504 28970 ts_tablet_manager.cc:1916] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:34.289223 28972 ts_tablet_manager.cc:1929] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.8
I20260504 14:07:34.289223 28971 ts_tablet_manager.cc:1929] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.8
I20260504 14:07:34.289307 28971 log.cc:1199] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/fa36ddefc9054c8c94d5ec702e0eccda
I20260504 14:07:34.289309 28972 log.cc:1199] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/fa36ddefc9054c8c94d5ec702e0eccda
I20260504 14:07:34.289633 28971 ts_tablet_manager.cc:1950] T fa36ddefc9054c8c94d5ec702e0eccda P 8bba0484174a4c3bab6791a586ccd1b6: Deleting consensus metadata
I20260504 14:07:34.289641 28972 ts_tablet_manager.cc:1950] T fa36ddefc9054c8c94d5ec702e0eccda P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting consensus metadata
I20260504 14:07:34.290596 28361 catalog_manager.cc:5002] TS 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501): tablet fa36ddefc9054c8c94d5ec702e0eccda (table test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]) successfully deleted
I20260504 14:07:34.290596 28362 catalog_manager.cc:5002] TS bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029): tablet fa36ddefc9054c8c94d5ec702e0eccda (table test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]) successfully deleted
I20260504 14:07:34.291630 28970 ts_tablet_manager.cc:1929] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.8
I20260504 14:07:34.291739 28970 log.cc:1199] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/fa36ddefc9054c8c94d5ec702e0eccda
I20260504 14:07:34.292057 28970 ts_tablet_manager.cc:1950] T fa36ddefc9054c8c94d5ec702e0eccda P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting consensus metadata
I20260504 14:07:34.292802 28364 catalog_manager.cc:5002] TS d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015): tablet fa36ddefc9054c8c94d5ec702e0eccda (table test-table [id=5c9d6f7e10fd4d94bf4700d912ec655d]) successfully deleted
May 04 14:07:34 dist-test-slave-2x32 krb5kdc[28329](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903654, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-user@KRBTEST.COM: 
May 04 14:07:34 dist-test-slave-2x32 krb5kdc[28329](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903654, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:34.313946 28980 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.304827 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50042 (local address 127.25.254.254:44627)
0504 14:07:34.305125 (+   298us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.305128 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.305145 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:34.305189 (+    44us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.305192 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.305237 (+    45us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.305320 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:34.306236 (+   916us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.306825 (+   589us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.307519 (+   694us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.307663 (+   144us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.310058 (+  2395us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:34.310081 (+    23us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:34.310083 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:34.310113 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:34.312146 (+  2033us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:34.312629 (+   483us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:34.312636 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:34.312641 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:34.312706 (+    65us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:34.312982 (+   276us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:34.312984 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:34.312986 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:34.313135 (+   149us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:34.313239 (+   104us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.313574 (+   335us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.313707 (+   133us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":220,"thread_start_us":131,"threads_started":1}
I20260504 14:07:34.316757 28375 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:34.317170 28375 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:34.322270 28515 tablet_service.cc:1511] Processing CreateTablet for tablet b8ae0360dbdc4768b7c474e1538f591a (DEFAULT_TABLE table=test-table [id=4d8d47aacf014db393bf26ea35580c87]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.322440 28658 tablet_service.cc:1511] Processing CreateTablet for tablet b8ae0360dbdc4768b7c474e1538f591a (DEFAULT_TABLE table=test-table [id=4d8d47aacf014db393bf26ea35580c87]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.322616 28515 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b8ae0360dbdc4768b7c474e1538f591a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.322587 28800 tablet_service.cc:1511] Processing CreateTablet for tablet b8ae0360dbdc4768b7c474e1538f591a (DEFAULT_TABLE table=test-table [id=4d8d47aacf014db393bf26ea35580c87]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.322737 28658 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b8ae0360dbdc4768b7c474e1538f591a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.322837 28800 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b8ae0360dbdc4768b7c474e1538f591a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.324563 28981 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Bootstrap starting.
I20260504 14:07:34.325026 28983 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Bootstrap starting.
I20260504 14:07:34.325515 28982 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Bootstrap starting.
I20260504 14:07:34.325779 28981 tablet_bootstrap.cc:654] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.325945 28983 tablet_bootstrap.cc:654] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.326543 28982 tablet_bootstrap.cc:654] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.327009 28981 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: No bootstrap required, opened a new log
I20260504 14:07:34.327123 28981 ts_tablet_manager.cc:1403] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:34.327591 28981 raft_consensus.cc:359] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.327742 28981 raft_consensus.cc:385] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.327760 28982 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: No bootstrap required, opened a new log
I20260504 14:07:34.327792 28981 raft_consensus.cc:740] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.327836 28982 ts_tablet_manager.cc:1403] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:34.327934 28981 consensus_queue.cc:260] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.328231 28981 ts_tablet_manager.cc:1434] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:34.328302 28982 raft_consensus.cc:359] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.328419 28982 raft_consensus.cc:385] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.328464 28982 raft_consensus.cc:740] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.328603 28982 consensus_queue.cc:260] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.328936 28982 ts_tablet_manager.cc:1434] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:34.329205 28983 tablet_bootstrap.cc:492] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: No bootstrap required, opened a new log
I20260504 14:07:34.329321 28983 ts_tablet_manager.cc:1403] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Time spent bootstrapping tablet: real 0.004s	user 0.000s	sys 0.001s
I20260504 14:07:34.329888 28983 raft_consensus.cc:359] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.330016 28983 raft_consensus.cc:385] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.330049 28983 raft_consensus.cc:740] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.330258 28983 consensus_queue.cc:260] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.330698 28983 ts_tablet_manager.cc:1434] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.002s
I20260504 14:07:34.629247 28987 raft_consensus.cc:493] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:34.629392 28987 raft_consensus.cc:515] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.630416 28987 leader_election.cc:290] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:34.634752 28989 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.630762 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35029 (local address 127.25.254.195:38439)
0504 14:07:34.631237 (+   475us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:34.631251 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:34.631360 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:34.631693 (+   333us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:34.631697 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:34.631738 (+    41us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:34.632074 (+   336us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.632083 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.633240 (+  1157us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.633245 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:34.634241 (+   996us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.634252 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.634399 (+   147us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.634421 (+    22us) client_negotiation.cc:770] Sending connection context
0504 14:07:34.634478 (+    57us) client_negotiation.cc:241] Negotiation successful
0504 14:07:34.634541 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":412,"mutex_wait_us":54,"spinlock_wait_cycles":29696,"thread_start_us":127,"threads_started":1}
I20260504 14:07:34.634750 28988 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.630762 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.25.254.195:42151)
0504 14:07:34.631198 (+   436us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:34.631214 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:34.631348 (+   134us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:34.631693 (+   345us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:34.631697 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:34.631736 (+    39us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:34.632074 (+   338us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.632083 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.633240 (+  1157us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.633245 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:34.634241 (+   996us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.634252 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.634399 (+   147us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.634421 (+    22us) client_negotiation.cc:770] Sending connection context
0504 14:07:34.634478 (+    57us) client_negotiation.cc:241] Negotiation successful
0504 14:07:34.634541 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":349,"spinlock_wait_cycles":2048,"thread_start_us":125,"threads_started":1}
I20260504 14:07:34.635203 28990 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.630865 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:42151 (local address 127.25.254.193:42015)
0504 14:07:34.631171 (+   306us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.631176 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.631318 (+   142us) server_negotiation.cc:408] Connection header received
0504 14:07:34.631475 (+   157us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.631479 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.631536 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.631670 (+   134us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:34.632251 (+   581us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.633081 (+   830us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.634408 (+  1327us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.634910 (+   502us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.634961 (+    51us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.635023 (+    62us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.635074 (+    51us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":210,"thread_start_us":91,"threads_started":1}
I20260504 14:07:34.635687 28991 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.630964 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:38439 (local address 127.25.254.194:35029)
0504 14:07:34.631234 (+   270us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.631244 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.631319 (+    75us) server_negotiation.cc:408] Connection header received
0504 14:07:34.631475 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.631479 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.631536 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.631649 (+   113us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:34.632251 (+   602us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.633080 (+   829us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.634453 (+  1373us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.635375 (+   922us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.635419 (+    44us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.635517 (+    98us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.635580 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":174,"thread_start_us":84,"threads_started":1}
I20260504 14:07:34.635764 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b8ae0360dbdc4768b7c474e1538f591a" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" is_pre_election: true
I20260504 14:07:34.635951 28536 raft_consensus.cc:2468] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 0.
I20260504 14:07:34.636118 28662 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b8ae0360dbdc4768b7c474e1538f591a" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" is_pre_election: true
I20260504 14:07:34.636271 28662 raft_consensus.cc:2468] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 0.
I20260504 14:07:34.636391 28754 leader_election.cc:304] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, d50b95c3eafb4e4ea0f3e4ec97a791ef; no voters: 
I20260504 14:07:34.636744 28987 raft_consensus.cc:2804] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:34.636802 28987 raft_consensus.cc:493] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:34.636832 28987 raft_consensus.cc:3060] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.637781 28987 raft_consensus.cc:515] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.638242 28987 leader_election.cc:290] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 election: Requested vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:34.638581 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b8ae0360dbdc4768b7c474e1538f591a" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
I20260504 14:07:34.638638 28662 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b8ae0360dbdc4768b7c474e1538f591a" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
I20260504 14:07:34.638736 28662 raft_consensus.cc:3060] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.638716 28536 raft_consensus.cc:3060] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.639551 28662 raft_consensus.cc:2468] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 1.
I20260504 14:07:34.639668 28536 raft_consensus.cc:2468] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 1.
I20260504 14:07:34.639904 28752 leader_election.cc:304] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, bbbce19d6ac948a1ba8dfbc4e8aebe53; no voters: 
I20260504 14:07:34.640075 28987 raft_consensus.cc:2804] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:34.640288 28987 raft_consensus.cc:697] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 LEADER]: Becoming Leader. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Running, Role: LEADER
I20260504 14:07:34.640529 28987 consensus_queue.cc:237] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.643079 28377 catalog_manager.cc:5671] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 reported cstate change: term changed from 0 to 1, leader changed from <none> to 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "8bba0484174a4c3bab6791a586ccd1b6" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } health_report { overall_health: HEALTHY } } }
I20260504 14:07:34.704813 28995 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.701469 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35542 (local address 127.25.254.195:37501)
0504 14:07:34.701745 (+   276us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.701748 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.701765 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:07:34.701835 (+    70us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.701838 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.701894 (+    56us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.702019 (+   125us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:34.702443 (+   424us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.702957 (+   514us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.703688 (+   731us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.703888 (+   200us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.703968 (+    80us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:34.704399 (+   431us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:34.704524 (+   125us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.704590 (+    66us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.704637 (+    47us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":200,"thread_start_us":104,"threads_started":1}
I20260504 14:07:34.715301 28998 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.712100 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.25.254.195:60503)
0504 14:07:34.712502 (+   402us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:34.712518 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:34.712639 (+   121us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:34.712921 (+   282us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:34.712923 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:34.712932 (+     9us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:34.713192 (+   260us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.713197 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.713978 (+   781us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.713983 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:34.714450 (+   467us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.714457 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.714589 (+   132us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.714632 (+    43us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:34.714986 (+   354us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:34.714997 (+    11us) client_negotiation.cc:770] Sending connection context
0504 14:07:34.715080 (+    83us) client_negotiation.cc:241] Negotiation successful
0504 14:07:34.715150 (+    70us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":195,"thread_start_us":124,"threads_started":1}
I20260504 14:07:34.715287 28990 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.712355 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:60503 (local address 127.25.254.193:42015)
0504 14:07:34.712507 (+   152us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.712514 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.712600 (+    86us) server_negotiation.cc:408] Connection header received
0504 14:07:34.712764 (+   164us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.712769 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.712823 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.712908 (+    85us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:34.713312 (+   404us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.713825 (+   513us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.714573 (+   748us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.714783 (+   210us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.714821 (+    38us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:34.714891 (+    70us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:34.714978 (+    87us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.715103 (+   125us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.715146 (+    43us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":64}
I20260504 14:07:34.722216 28662 raft_consensus.cc:1275] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Refusing update from remote peer 8bba0484174a4c3bab6791a586ccd1b6: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:34.722265 28536 raft_consensus.cc:1275] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Refusing update from remote peer 8bba0484174a4c3bab6791a586ccd1b6: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:34.722853 28992 consensus_queue.cc:1048] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:34.723040 28987 consensus_queue.cc:1048] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:34.743767 28995 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.740220 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:43923 (local address 127.25.254.195:37501)
0504 14:07:34.740517 (+   297us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:34.740523 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:34.740798 (+   275us) server_negotiation.cc:408] Connection header received
0504 14:07:34.740845 (+    47us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:34.740848 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:34.740902 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:34.740979 (+    77us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:34.741519 (+   540us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.742027 (+   508us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.742728 (+   701us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.742960 (+   232us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.743009 (+    49us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:34.743110 (+   101us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:34.743187 (+    77us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:34.743582 (+   395us) server_negotiation.cc:300] Negotiation successful
0504 14:07:34.743631 (+    49us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":227}
I20260504 14:07:34.743770 29003 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:34.740186 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37501 (local address 127.25.254.193:43923)
0504 14:07:34.740508 (+   322us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:34.740523 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:34.740634 (+   111us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:34.741114 (+   480us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:34.741117 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:34.741127 (+    10us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:07:34.741393 (+   266us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.741402 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.742142 (+   740us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:34.742146 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:34.742602 (+   456us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:34.742609 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:34.742744 (+   135us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:34.742769 (+    25us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:07:34.743466 (+   697us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:07:34.743473 (+     7us) client_negotiation.cc:770] Sending connection context
0504 14:07:34.743544 (+    71us) client_negotiation.cc:241] Negotiation successful
0504 14:07:34.743612 (+    68us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":221,"thread_start_us":160,"threads_started":1}
I20260504 14:07:34.769579 28376 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:34.769789 28376 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:34.772150 28376 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Sending DeleteTablet for 3 replicas of tablet b8ae0360dbdc4768b7c474e1538f591a
I20260504 14:07:34.772915 28800 tablet_service.cc:1558] Processing DeleteTablet for tablet b8ae0360dbdc4768b7c474e1538f591a with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:35538
I20260504 14:07:34.773061 28658 tablet_service.cc:1558] Processing DeleteTablet for tablet b8ae0360dbdc4768b7c474e1538f591a with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:54334
I20260504 14:07:34.773101 28971 tablet_replica.cc:333] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: stopping tablet replica
I20260504 14:07:34.773211 28971 raft_consensus.cc:2243] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:34.773227 28972 tablet_replica.cc:333] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: stopping tablet replica
I20260504 14:07:34.773319 28972 raft_consensus.cc:2243] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:34.773527 28972 raft_consensus.cc:2272] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.773520 28971 raft_consensus.cc:2272] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.773717 28515 tablet_service.cc:1558] Processing DeleteTablet for tablet b8ae0360dbdc4768b7c474e1538f591a with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:34 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:56020
I20260504 14:07:34.773910 28970 tablet_replica.cc:333] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: stopping tablet replica
I20260504 14:07:34.774041 28970 raft_consensus.cc:2243] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:34.774184 28970 raft_consensus.cc:2272] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:34.774462 28971 ts_tablet_manager.cc:1916] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:34.774463 28972 ts_tablet_manager.cc:1916] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:34.777086 28376 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260504 14:07:34.777392 28970 ts_tablet_manager.cc:1916] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting tablet data with delete state TABLET_DATA_DELETED
W20260504 14:07:34.777619 28376 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:34.778517 28972 ts_tablet_manager.cc:1929] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:34.778596 28972 log.cc:1199] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/b8ae0360dbdc4768b7c474e1538f591a
I20260504 14:07:34.779114 28972 ts_tablet_manager.cc:1950] T b8ae0360dbdc4768b7c474e1538f591a P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting consensus metadata
I20260504 14:07:34.779872 28362 catalog_manager.cc:5002] TS bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029): tablet b8ae0360dbdc4768b7c474e1538f591a (table test-table [id=4d8d47aacf014db393bf26ea35580c87]) successfully deleted
I20260504 14:07:34.780011 28971 ts_tablet_manager.cc:1929] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:34.780081 28971 log.cc:1199] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/b8ae0360dbdc4768b7c474e1538f591a
I20260504 14:07:34.780352 28971 ts_tablet_manager.cc:1950] T b8ae0360dbdc4768b7c474e1538f591a P 8bba0484174a4c3bab6791a586ccd1b6: Deleting consensus metadata
I20260504 14:07:34.780622 28970 ts_tablet_manager.cc:1929] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:07:34.780687 28970 log.cc:1199] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/b8ae0360dbdc4768b7c474e1538f591a
I20260504 14:07:34.781136 28361 catalog_manager.cc:5002] TS 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501): tablet b8ae0360dbdc4768b7c474e1538f591a (table test-table [id=4d8d47aacf014db393bf26ea35580c87]) successfully deleted
I20260504 14:07:34.780925 28970 ts_tablet_manager.cc:1950] T b8ae0360dbdc4768b7c474e1538f591a P d50b95c3eafb4e4ea0f3e4ec97a791ef: Deleting consensus metadata
I20260504 14:07:34.781917 28364 catalog_manager.cc:5002] TS d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015): tablet b8ae0360dbdc4768b7c474e1538f591a (table test-table [id=4d8d47aacf014db393bf26ea35580c87]) successfully deleted
I20260504 14:07:34.786129 28515 tablet_service.cc:1511] Processing CreateTablet for tablet c72ee6707a0c488f8756badb672d19a3 (DEFAULT_TABLE table=test-table [id=1aed3dcacbdc4070a39cc5601b9a73ca]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.786229 28800 tablet_service.cc:1511] Processing CreateTablet for tablet c72ee6707a0c488f8756badb672d19a3 (DEFAULT_TABLE table=test-table [id=1aed3dcacbdc4070a39cc5601b9a73ca]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.786499 28515 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet c72ee6707a0c488f8756badb672d19a3. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.786497 28800 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet c72ee6707a0c488f8756badb672d19a3. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.788794 28983 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Bootstrap starting.
I20260504 14:07:34.788810 28982 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Bootstrap starting.
I20260504 14:07:34.789525 28658 tablet_service.cc:1511] Processing CreateTablet for tablet c72ee6707a0c488f8756badb672d19a3 (DEFAULT_TABLE table=test-table [id=1aed3dcacbdc4070a39cc5601b9a73ca]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:34.789822 28658 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet c72ee6707a0c488f8756badb672d19a3. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:34.789840 28983 tablet_bootstrap.cc:654] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.789846 28982 tablet_bootstrap.cc:654] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.791109 28983 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: No bootstrap required, opened a new log
I20260504 14:07:34.791188 28983 ts_tablet_manager.cc:1403] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:34.791661 28983 raft_consensus.cc:359] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.791778 28983 raft_consensus.cc:385] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.791810 28983 raft_consensus.cc:740] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.791940 28983 consensus_queue.cc:260] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.792110 28982 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef: No bootstrap required, opened a new log
I20260504 14:07:34.792194 28982 ts_tablet_manager.cc:1403] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:34.792618 28982 raft_consensus.cc:359] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.792737 28982 raft_consensus.cc:385] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.792765 28982 raft_consensus.cc:740] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d50b95c3eafb4e4ea0f3e4ec97a791ef, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.792876 28982 consensus_queue.cc:260] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.793144 28982 ts_tablet_manager.cc:1434] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:34.793669 28983 ts_tablet_manager.cc:1434] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:34.795730 28981 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Bootstrap starting.
I20260504 14:07:34.796806 28981 tablet_bootstrap.cc:654] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:34.797919 28981 tablet_bootstrap.cc:492] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: No bootstrap required, opened a new log
I20260504 14:07:34.797993 28981 ts_tablet_manager.cc:1403] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:34.798503 28981 raft_consensus.cc:359] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.798601 28981 raft_consensus.cc:385] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:34.798624 28981 raft_consensus.cc:740] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbbce19d6ac948a1ba8dfbc4e8aebe53, State: Initialized, Role: FOLLOWER
I20260504 14:07:34.798774 28981 consensus_queue.cc:260] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.799063 28981 ts_tablet_manager.cc:1434] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:07:34.823217 29001 raft_consensus.cc:493] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:34.823379 29001 raft_consensus.cc:515] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.823869 29001 leader_election.cc:290] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:34.824350 28662 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "c72ee6707a0c488f8756badb672d19a3" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" is_pre_election: true
I20260504 14:07:34.824395 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "c72ee6707a0c488f8756badb672d19a3" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" is_pre_election: true
I20260504 14:07:34.824504 28662 raft_consensus.cc:2468] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 0.
I20260504 14:07:34.824594 28536 raft_consensus.cc:2468] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 0.
I20260504 14:07:34.824882 28752 leader_election.cc:304] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, bbbce19d6ac948a1ba8dfbc4e8aebe53; no voters: 
I20260504 14:07:34.825093 29001 raft_consensus.cc:2804] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:34.825157 29001 raft_consensus.cc:493] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:34.825215 29001 raft_consensus.cc:3060] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.834493 29001 raft_consensus.cc:515] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.835696 29001 leader_election.cc:290] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 election: Requested vote from peers d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015), bbbce19d6ac948a1ba8dfbc4e8aebe53 (127.25.254.194:35029)
I20260504 14:07:34.835845 28662 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "c72ee6707a0c488f8756badb672d19a3" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53"
I20260504 14:07:34.835985 28662 raft_consensus.cc:3060] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.835999 28536 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "c72ee6707a0c488f8756badb672d19a3" candidate_uuid: "8bba0484174a4c3bab6791a586ccd1b6" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef"
I20260504 14:07:34.836133 28536 raft_consensus.cc:3060] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:34.837155 28662 raft_consensus.cc:2468] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 1.
I20260504 14:07:34.837545 28536 raft_consensus.cc:2468] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8bba0484174a4c3bab6791a586ccd1b6 in term 1.
I20260504 14:07:34.837742 28752 leader_election.cc:304] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8bba0484174a4c3bab6791a586ccd1b6, bbbce19d6ac948a1ba8dfbc4e8aebe53; no voters: 
I20260504 14:07:34.838040 29001 raft_consensus.cc:2804] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:34.838130 29001 raft_consensus.cc:697] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 LEADER]: Becoming Leader. State: Replica: 8bba0484174a4c3bab6791a586ccd1b6, State: Running, Role: LEADER
I20260504 14:07:34.838320 29001 consensus_queue.cc:237] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } }
I20260504 14:07:34.840423 28376 catalog_manager.cc:5671] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 reported cstate change: term changed from 0 to 1, leader changed from <none> to 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "8bba0484174a4c3bab6791a586ccd1b6" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8bba0484174a4c3bab6791a586ccd1b6" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37501 } health_report { overall_health: HEALTHY } } }
I20260504 14:07:34.862031 28660 raft_consensus.cc:1275] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Refusing update from remote peer 8bba0484174a4c3bab6791a586ccd1b6: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:34.862344 28536 raft_consensus.cc:1275] T c72ee6707a0c488f8756badb672d19a3 P d50b95c3eafb4e4ea0f3e4ec97a791ef [term 1 FOLLOWER]: Refusing update from remote peer 8bba0484174a4c3bab6791a586ccd1b6: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:34.862639 28992 consensus_queue.cc:1048] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "bbbce19d6ac948a1ba8dfbc4e8aebe53" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35029 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:34.862802 28987 consensus_queue.cc:1048] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d50b95c3eafb4e4ea0f3e4ec97a791ef" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42015 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260504 14:07:34.879357 28954 tablet_replica.cc:1307] Aborted: operation has been aborted: cancelling pending write operations
I20260504 14:07:36.449796 28376 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:36.449958 28376 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:50042:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:36.452623 28376 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 4fbcd8298649420c921777c7ea53a991: Sending DeleteTablet for 3 replicas of tablet c72ee6707a0c488f8756badb672d19a3
I20260504 14:07:36.453239 28515 tablet_service.cc:1558] Processing DeleteTablet for tablet c72ee6707a0c488f8756badb672d19a3 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:36 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:56020
I20260504 14:07:36.453317 28800 tablet_service.cc:1558] Processing DeleteTablet for tablet c72ee6707a0c488f8756badb672d19a3 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:36 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:35538
I20260504 14:07:36.453317 28658 tablet_service.cc:1558] Processing DeleteTablet for tablet c72ee6707a0c488f8756badb672d19a3 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:36 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:54334
I20260504 14:07:36.453697 29034 tablet_replica.cc:333] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: stopping tablet replica
I20260504 14:07:36.453730 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 28451
I20260504 14:07:36.455355 29036 tablet_replica.cc:333] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: stopping tablet replica
I20260504 14:07:36.455394 29034 raft_consensus.cc:2243] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:36.455526 29036 raft_consensus.cc:2243] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:36.455611 29034 raft_consensus.cc:2272] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:36.455760 29036 raft_consensus.cc:2272] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:36.456316 29034 ts_tablet_manager.cc:1916] T c72ee6707a0c488f8756badb672d19a3 P bbbce19d6ac948a1ba8dfbc4e8aebe53: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:36.456549 29036 ts_tablet_manager.cc:1916] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting tablet data with delete state TABLET_DATA_DELETED
W20260504 14:07:36.462244 28364 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:42015 (error 108)
I20260504 14:07:36.462555 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 28593
I20260504 14:07:36.465085 29038 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:36.464683 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42015 (local address 127.0.0.1:56038)
0504 14:07:36.464939 (+   256us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:36.464993 (+    54us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:42015: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":168,"thread_start_us":80,"threads_started":1}
W20260504 14:07:36.465229 28364 catalog_manager.cc:4729] TS d50b95c3eafb4e4ea0f3e4ec97a791ef (127.25.254.193:42015): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet c72ee6707a0c488f8756badb672d19a3: Network error: Client connection negotiation failed: client connection to 127.25.254.193:42015: connect: Connection refused (error 111)
I20260504 14:07:36.495043 29036 ts_tablet_manager.cc:1929] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.8
I20260504 14:07:36.495182 29036 log.cc:1199] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TxnSmokeWithDifferentUserTypes.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/c72ee6707a0c488f8756badb672d19a3
I20260504 14:07:36.495513 29036 ts_tablet_manager.cc:1950] T c72ee6707a0c488f8756badb672d19a3 P 8bba0484174a4c3bab6791a586ccd1b6: Deleting consensus metadata
I20260504 14:07:36.499135 28361 catalog_manager.cc:5002] TS 8bba0484174a4c3bab6791a586ccd1b6 (127.25.254.195:37501): tablet c72ee6707a0c488f8756badb672d19a3 (table test-table [id=1aed3dcacbdc4070a39cc5601b9a73ca]) successfully deleted
I20260504 14:07:36.500000 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 28735
I20260504 14:07:36.501762 29038 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:36.500204 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35029 (local address 127.0.0.1:54356)
0504 14:07:36.501571 (+  1367us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:36.501643 (+    72us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.194:35029: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":1233}
I20260504 14:07:36.507371 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 28345
2026-05-04T14:07:36Z chronyd exiting
[       OK ] SecurityITest.TxnSmokeWithDifferentUserTypes (7676 ms)
[ RUN      ] SecurityITest.TestNoKerberosCredentials
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:38 dist-test-slave-2x32 krb5kdc[29041](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:38 dist-test-slave-2x32 krb5kdc[29041](info): set up 2 sockets
May 04 14:07:38 dist-test-slave-2x32 krb5kdc[29041](info): commencing operation
krb5kdc: starting...
W20260504 14:07:40.530633 26619 mini_kdc.cc:121] Time spent starting KDC: real 3.998s	user 0.004s	sys 0.002s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:40 dist-test-slave-2x32 krb5kdc[29041](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903660, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:40Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:40Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:40.686432 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35581
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44565
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:35581
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:40.792372 29057 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:40.792614 29057 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:40.792668 29057 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:40.796231 29057 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:40.796305 29057 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:40.796330 29057 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:40.796350 29057 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:40.796368 29057 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:40.800869 29057 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44565
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:35581
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35581
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29057
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:40.802045 29057 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:40.802951 29057 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:40.808881 29062 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:40.808872 29063 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:40.808981 29057 server_base.cc:1061] running on GCE node
W20260504 14:07:40.808872 29065 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:40.809616 29057 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:40.810519 29057 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:40.811681 29057 hybrid_clock.cc:648] HybridClock initialized: now 1777903660811663 us; error 29 us; skew 500 ppm
May 04 14:07:40 dist-test-slave-2x32 krb5kdc[29041](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903660, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:40.814406 29057 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:40.815649 29057 webserver.cc:492] Webserver started at http://127.25.254.254:36747/ using document root <none> and password file <none>
I20260504 14:07:40.816250 29057 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:40.816303 29057 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:40.816536 29057 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:40.818459 29057 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "ca98b4e3319e401ab73bda2ff444f1c7"
format_stamp: "Formatted at 2026-05-04 14:07:40 on dist-test-slave-2x32"
server_key: "7f41339dbee15165a43b1a81866b3835"
server_key_iv: "c280433ce0dc8692dcf5737839310a0c"
server_key_version: "encryptionkey@0"
I20260504 14:07:40.819008 29057 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "ca98b4e3319e401ab73bda2ff444f1c7"
format_stamp: "Formatted at 2026-05-04 14:07:40 on dist-test-slave-2x32"
server_key: "7f41339dbee15165a43b1a81866b3835"
server_key_iv: "c280433ce0dc8692dcf5737839310a0c"
server_key_version: "encryptionkey@0"
I20260504 14:07:40.822830 29057 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.003s	sys 0.001s
I20260504 14:07:40.825285 29072 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:40.826565 29057 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:40.826679 29057 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "ca98b4e3319e401ab73bda2ff444f1c7"
format_stamp: "Formatted at 2026-05-04 14:07:40 on dist-test-slave-2x32"
server_key: "7f41339dbee15165a43b1a81866b3835"
server_key_iv: "c280433ce0dc8692dcf5737839310a0c"
server_key_version: "encryptionkey@0"
I20260504 14:07:40.826845 29057 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:40.839057 29057 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:40.842319 29057 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:40.842535 29057 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:40.851114 29057 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:35581
I20260504 14:07:40.851109 29124 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:35581 every 8 connection(s)
I20260504 14:07:40.852280 29057 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:40.855396 29125 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:40.861667 29125 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: Bootstrap starting.
I20260504 14:07:40.862469 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29057
I20260504 14:07:40.862682 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:40.863030 26619 external_mini_cluster.cc:1468] Setting key 556b19b794cb7b4f8e1130abac41121f
I20260504 14:07:40.864776 29125 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:40.865777 29125 log.cc:826] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:40.867776 29125 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: No bootstrap required, opened a new log
I20260504 14:07:40.870642 29125 raft_consensus.cc:359] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } }
I20260504 14:07:40.870874 29125 raft_consensus.cc:385] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:40.870966 29125 raft_consensus.cc:740] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: ca98b4e3319e401ab73bda2ff444f1c7, State: Initialized, Role: FOLLOWER
May 04 14:07:40 dist-test-slave-2x32 krb5kdc[29041](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903660, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:40.871479 29125 consensus_queue.cc:260] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } }
I20260504 14:07:40.871632 29125 raft_consensus.cc:399] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:40.871722 29125 raft_consensus.cc:493] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:40.871846 29125 raft_consensus.cc:3060] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:40.873018 29125 raft_consensus.cc:515] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } }
I20260504 14:07:40.873380 29125 leader_election.cc:304] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: ca98b4e3319e401ab73bda2ff444f1c7; no voters: 
I20260504 14:07:40.873715 29125 leader_election.cc:290] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:40.873858 29130 raft_consensus.cc:2804] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:40.874104 29130 raft_consensus.cc:697] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [term 1 LEADER]: Becoming Leader. State: Replica: ca98b4e3319e401ab73bda2ff444f1c7, State: Running, Role: LEADER
I20260504 14:07:40.874493 29130 consensus_queue.cc:237] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } }
I20260504 14:07:40.875054 29125 sys_catalog.cc:565] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:40.875790 29131 sys_catalog.cc:455] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [sys.catalog]: SysCatalogTable state changed. Reason: New leader ca98b4e3319e401ab73bda2ff444f1c7. Latest consensus state: current_term: 1 leader_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } } }
I20260504 14:07:40.876559 29131 sys_catalog.cc:458] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:40.876439 29132 sys_catalog.cc:455] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ca98b4e3319e401ab73bda2ff444f1c7" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35581 } } }
I20260504 14:07:40.876890 29132 sys_catalog.cc:458] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:40.878367 29128 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:40.864208 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:39782 (local address 127.25.254.254:35581)
0504 14:07:40.864740 (+   532us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:40.864754 (+    14us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:40.864799 (+    45us) server_negotiation.cc:408] Connection header received
0504 14:07:40.865572 (+   773us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:40.865602 (+    30us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:40.866027 (+   425us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:40.866484 (+   457us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:40.867438 (+   954us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:40.868262 (+   824us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:40.868964 (+   702us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:40.869263 (+   299us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:40.871791 (+  2528us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:40.871836 (+    45us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:40.871855 (+    19us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:40.871897 (+    42us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:40.874797 (+  2900us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:40.875379 (+   582us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:40.875385 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:40.875391 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:40.875498 (+   107us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:40.875810 (+   312us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:40.875813 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:40.875816 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:40.876258 (+   442us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:40.876460 (+   202us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:40.877250 (+   790us) server_negotiation.cc:300] Negotiation successful
0504 14:07:40.877498 (+   248us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":347,"thread_start_us":190,"threads_started":1}
W20260504 14:07:40.880504 29146 catalog_manager.cc:1568] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:07:40.880581 29146 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:07:40.880661 29139 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:40.881582 29139 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:40.887262 29139 catalog_manager.cc:1357] Generated new cluster ID: 0f02930f375a4da684f8b68f6ad55083
I20260504 14:07:40.887354 29139 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:40.905717 29139 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:40.906674 29139 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:40.926891 29139 catalog_manager.cc:6044] T 00000000000000000000000000000000 P ca98b4e3319e401ab73bda2ff444f1c7: Generated new TSK 0
I20260504 14:07:40.927547 29139 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:40.995033 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35581
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44565
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:41.105383 29153 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:41.105607 29153 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:41.105666 29153 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:41.109028 29153 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:41.109107 29153 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:41.109195 29153 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:41.113539 29153 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44565
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35581
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29153
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:41.114672 29153 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:41.115480 29153 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:41.121971 29158 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.121958 29159 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.121958 29161 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:41.122298 29153 server_base.cc:1061] running on GCE node
I20260504 14:07:41.122910 29153 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:41.123493 29153 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:41.124688 29153 hybrid_clock.cc:648] HybridClock initialized: now 1777903661124677 us; error 30 us; skew 500 ppm
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:41.127560 29153 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:41.128634 29153 webserver.cc:492] Webserver started at http://127.25.254.193:46375/ using document root <none> and password file <none>
I20260504 14:07:41.129158 29153 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:41.129204 29153 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:41.129362 29153 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:41.131279 29153 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "ea528946f5c842d8897933e6f427af3f"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a5769e764c6ebc018d0a4282265714fe"
server_key_iv: "75630e6e78899f6597e11d5bba8bfdf1"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.131726 29153 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "ea528946f5c842d8897933e6f427af3f"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a5769e764c6ebc018d0a4282265714fe"
server_key_iv: "75630e6e78899f6597e11d5bba8bfdf1"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.135231 29153 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.001s	sys 0.003s
I20260504 14:07:41.137584 29168 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.138820 29153 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:41.138926 29153 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "ea528946f5c842d8897933e6f427af3f"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a5769e764c6ebc018d0a4282265714fe"
server_key_iv: "75630e6e78899f6597e11d5bba8bfdf1"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.139009 29153 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:41.153219 29153 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:41.155845 29153 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:41.156050 29153 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:41.156665 29153 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:41.157536 29153 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:41.157608 29153 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.157677 29153 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:41.157720 29153 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.168166 29153 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:36757
I20260504 14:07:41.168184 29281 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:36757 every 8 connection(s)
I20260504 14:07:41.169176 29153 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:41.171308 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29153
I20260504 14:07:41.171445 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:41.171742 26619 external_mini_cluster.cc:1468] Setting key 8f5cb45c6644962ba72068a80c7d3ed4
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:41.182787 29128 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.170998 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60363 (local address 127.25.254.254:35581)
0504 14:07:41.171195 (+   197us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:41.171200 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:41.171928 (+   728us) server_negotiation.cc:408] Connection header received
0504 14:07:41.172892 (+   964us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:41.172895 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:41.172941 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:41.173045 (+   104us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:41.174814 (+  1769us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.175419 (+   605us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.176118 (+   699us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.176299 (+   181us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.179085 (+  2786us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:41.179106 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:41.179108 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:41.179137 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:41.180648 (+  1511us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.181200 (+   552us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.181204 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.181205 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.181253 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.181617 (+   364us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.181619 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.181621 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.181776 (+   155us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:41.181887 (+   111us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:41.182451 (+   564us) server_negotiation.cc:300] Negotiation successful
0504 14:07:41.182595 (+   144us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":76}
I20260504 14:07:41.183465 29284 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.171281 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35581 (local address 127.25.254.193:60363)
0504 14:07:41.171778 (+   497us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:41.171819 (+    41us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:41.172673 (+   854us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:41.173230 (+   557us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:41.173243 (+    13us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:41.173662 (+   419us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:41.174592 (+   930us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.174606 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.175567 (+   961us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.175576 (+     9us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:41.175969 (+   393us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.175975 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.176202 (+   227us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.176900 (+   698us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:41.176918 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:41.178911 (+  1993us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:41.180807 (+  1896us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.180814 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.180828 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.181076 (+   248us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.181345 (+   269us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.181348 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.181351 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.181467 (+   116us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.181886 (+   419us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:41.181894 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:41.182122 (+   228us) client_negotiation.cc:770] Sending connection context
0504 14:07:41.182408 (+   286us) client_negotiation.cc:241] Negotiation successful
0504 14:07:41.182710 (+   302us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":304,"thread_start_us":175,"threads_started":1}
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
I20260504 14:07:41.184862 29282 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35581
I20260504 14:07:41.185143 29282 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:41.185777 29282 heartbeater.cc:507] Master 127.25.254.254:35581 requested a full tablet report, sending...
I20260504 14:07:41.187565 29089 ts_manager.cc:194] Registered new tserver with Master: ea528946f5c842d8897933e6f427af3f (127.25.254.193:36757)
I20260504 14:07:41.188830 29089 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:60363
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:41.230676 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35581
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44565
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:41.341434 29289 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:41.341713 29289 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:41.341809 29289 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:41.345281 29289 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:41.345389 29289 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:41.345510 29289 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:41.350009 29289 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44565
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35581
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29289
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:41.351254 29289 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:41.352156 29289 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:41.358862 29294 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.358850 29295 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.358850 29297 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:41.359259 29289 server_base.cc:1061] running on GCE node
I20260504 14:07:41.359716 29289 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:41.360289 29289 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:41.361545 29289 hybrid_clock.cc:648] HybridClock initialized: now 1777903661361512 us; error 50 us; skew 500 ppm
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:41.364379 29289 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:41.365868 29289 webserver.cc:492] Webserver started at http://127.25.254.194:41591/ using document root <none> and password file <none>
I20260504 14:07:41.366515 29289 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:41.366590 29289 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:41.366806 29289 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:41.368541 29289 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "4e02dafb1ba74c11ae4a164623a77681"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a3cf103e965db034b907ecda18e19f0c"
server_key_iv: "80a11914ee9d9d088edcf5ca2e7a716f"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.369035 29289 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "4e02dafb1ba74c11ae4a164623a77681"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a3cf103e965db034b907ecda18e19f0c"
server_key_iv: "80a11914ee9d9d088edcf5ca2e7a716f"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.372444 29289 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.002s
I20260504 14:07:41.374719 29304 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.375646 29289 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:41.375792 29289 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "4e02dafb1ba74c11ae4a164623a77681"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "a3cf103e965db034b907ecda18e19f0c"
server_key_iv: "80a11914ee9d9d088edcf5ca2e7a716f"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.375898 29289 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:41.386934 29289 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:41.389806 29289 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:41.390018 29289 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:41.390694 29289 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:41.391598 29289 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:41.391670 29289 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.391741 29289 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:41.391791 29289 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.401401 29289 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:42127
I20260504 14:07:41.401443 29417 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:42127 every 8 connection(s)
I20260504 14:07:41.402441 29289 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:41.406872 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29289
I20260504 14:07:41.406965 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:41.407196 26619 external_mini_cluster.cc:1468] Setting key 89e53a14bc779a1e932dc6f032cbb526
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:07:41.417526 29128 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.404509 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:38339 (local address 127.25.254.254:35581)
0504 14:07:41.404705 (+   196us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:41.404714 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:41.405393 (+   679us) server_negotiation.cc:408] Connection header received
0504 14:07:41.406469 (+  1076us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:41.406473 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:41.406539 (+    66us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:41.406685 (+   146us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:41.408819 (+  2134us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.409561 (+   742us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.410360 (+   799us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.410600 (+   240us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.413252 (+  2652us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:41.413283 (+    31us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:41.413290 (+     7us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:41.413329 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:41.415143 (+  1814us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.415758 (+   615us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.415763 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.415766 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.415834 (+    68us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.416224 (+   390us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.416230 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.416234 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.416456 (+   222us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:41.416578 (+   122us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:41.417140 (+   562us) server_negotiation.cc:300] Negotiation successful
0504 14:07:41.417279 (+   139us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":89}
I20260504 14:07:41.418218 29420 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.404769 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35581 (local address 127.25.254.194:38339)
0504 14:07:41.405224 (+   455us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:41.405260 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:41.406138 (+   878us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:41.406903 (+   765us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:41.406912 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:41.407469 (+   557us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:41.408630 (+  1161us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.408651 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.409723 (+  1072us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.409726 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:41.410133 (+   407us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.410139 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.410433 (+   294us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.411256 (+   823us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:41.411276 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:41.413049 (+  1773us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:41.415340 (+  2291us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.415347 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.415359 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.415623 (+   264us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.415983 (+   360us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.415987 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.415988 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.416109 (+   121us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.416586 (+   477us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:41.416591 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:41.416838 (+   247us) client_negotiation.cc:770] Sending connection context
0504 14:07:41.417052 (+   214us) client_negotiation.cc:241] Negotiation successful
0504 14:07:41.417256 (+   204us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":274,"thread_start_us":114,"threads_started":1}
I20260504 14:07:41.419479 29418 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35581
I20260504 14:07:41.419763 29418 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:41.420295 29418 heartbeater.cc:507] Master 127.25.254.254:35581 requested a full tablet report, sending...
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:41.421546 29089 ts_manager.cc:194] Registered new tserver with Master: 4e02dafb1ba74c11ae4a164623a77681 (127.25.254.194:42127)
I20260504 14:07:41.422223 29089 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:38339
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:41.466547 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35581
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44565
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:41.577718 29425 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:41.577955 29425 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:41.578044 29425 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:41.581537 29425 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:41.581614 29425 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:41.581743 29425 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:41.586306 29425 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44565
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35581
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29425
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:41.587517 29425 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:41.588397 29425 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:41.595017 29433 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.595019 29430 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:41.595077 29431 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:41.595317 29425 server_base.cc:1061] running on GCE node
I20260504 14:07:41.595949 29425 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:41.596560 29425 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:41.597752 29425 hybrid_clock.cc:648] HybridClock initialized: now 1777903661597727 us; error 51 us; skew 500 ppm
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:41.600997 29425 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:41.602250 29425 webserver.cc:492] Webserver started at http://127.25.254.195:44283/ using document root <none> and password file <none>
I20260504 14:07:41.602843 29425 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:41.602919 29425 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:41.603139 29425 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:41.604969 29425 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "a510a1a3653e4b9c998d2495ebc23b80"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "05bc1e1aa667fc79e8dccae087ec74ac"
server_key_iv: "de16c3e09a232f70e74c8b3885da927b"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.605461 29425 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "a510a1a3653e4b9c998d2495ebc23b80"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "05bc1e1aa667fc79e8dccae087ec74ac"
server_key_iv: "de16c3e09a232f70e74c8b3885da927b"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.609032 29425 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:07:41.611392 29440 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.612639 29425 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:41.612807 29425 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "a510a1a3653e4b9c998d2495ebc23b80"
format_stamp: "Formatted at 2026-05-04 14:07:41 on dist-test-slave-2x32"
server_key: "05bc1e1aa667fc79e8dccae087ec74ac"
server_key_iv: "de16c3e09a232f70e74c8b3885da927b"
server_key_version: "encryptionkey@0"
I20260504 14:07:41.612927 29425 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:41.629249 29425 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:41.632778 29425 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:41.633021 29425 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:41.633695 29425 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:41.634678 29425 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:41.634754 29425 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.634824 29425 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:41.634871 29425 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:41.645140 29425 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:46179
I20260504 14:07:41.645205 29553 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:46179 every 8 connection(s)
I20260504 14:07:41.646255 29425 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:41.653607 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29425
I20260504 14:07:41.653769 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNoKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:41.654052 26619 external_mini_cluster.cc:1468] Setting key 2f9634308c4dd653c2f6e0caadc65e86
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29041](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903661, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:41.659425 29128 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.648113 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:37573 (local address 127.25.254.254:35581)
0504 14:07:41.648280 (+   167us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:41.648285 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:41.649102 (+   817us) server_negotiation.cc:408] Connection header received
0504 14:07:41.649993 (+   891us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:41.649996 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:41.650054 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:41.650259 (+   205us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:41.651843 (+  1584us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.652350 (+   507us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.653032 (+   682us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.653182 (+   150us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.655712 (+  2530us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:41.655729 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:41.655732 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:41.655769 (+    37us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:41.657278 (+  1509us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.657795 (+   517us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.657799 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.657801 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.657854 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:41.658178 (+   324us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:41.658184 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:41.658187 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:41.658432 (+   245us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:41.658624 (+   192us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:41.659127 (+   503us) server_negotiation.cc:300] Negotiation successful
0504 14:07:41.659255 (+   128us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":61}
I20260504 14:07:41.660079 29556 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:41.648465 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35581 (local address 127.25.254.195:37573)
0504 14:07:41.648948 (+   483us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:41.648984 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:41.649790 (+   806us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:41.650395 (+   605us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:41.650403 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:41.650830 (+   427us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:41.651682 (+   852us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.651695 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.652486 (+   791us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:41.652491 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:41.652913 (+   422us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:41.652920 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:41.653088 (+   168us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:41.653778 (+   690us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:41.653807 (+    29us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:41.655563 (+  1756us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:41.657421 (+  1858us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.657428 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.657440 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.657683 (+   243us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.657959 (+   276us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:41.657962 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:41.657964 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:41.658067 (+   103us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:41.658622 (+   555us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:41.658630 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:41.658874 (+   244us) client_negotiation.cc:770] Sending connection context
0504 14:07:41.659091 (+   217us) client_negotiation.cc:241] Negotiation successful
0504 14:07:41.659341 (+   250us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":313,"thread_start_us":121,"threads_started":1}
I20260504 14:07:41.661250 29554 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35581
I20260504 14:07:41.661532 29554 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:41.662097 29554 heartbeater.cc:507] Master 127.25.254.254:35581 requested a full tablet report, sending...
I20260504 14:07:41.663203 29089 ts_manager.cc:194] Registered new tserver with Master: a510a1a3653e4b9c998d2495ebc23b80 (127.25.254.195:46179)
I20260504 14:07:41.663779 29089 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:37573
I20260504 14:07:41.668517 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
W20260504 14:07:41.681550 29128 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:41.680012 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:39792 (local address 127.25.254.254:35581)
0504 14:07:41.680231 (+   219us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:41.680236 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:41.680372 (+   136us) server_negotiation.cc:408] Connection header received
0504 14:07:41.680557 (+   185us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:41.680561 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:41.680616 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:41.680711 (+    95us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:41.681423 (+   712us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:39792: BlockingRecv error: recv got EOF from 127.0.0.1:39792 (error 108)
Metrics: {"server-negotiator.queue_time_us":74}
I20260504 14:07:41.683110 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29153
I20260504 14:07:41.689471 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29289
I20260504 14:07:41.695659 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29425
I20260504 14:07:41.701819 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29057
2026-05-04T14:07:41Z chronyd exiting
[       OK ] SecurityITest.TestNoKerberosCredentials (5193 ms)
[ RUN      ] SecurityITest.TestRebalanceReportUnauthorized
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29566](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29566](info): set up 2 sockets
May 04 14:07:41 dist-test-slave-2x32 krb5kdc[29566](info): commencing operation
krb5kdc: starting...
W20260504 14:07:43.768625 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.042s	user 0.002s	sys 0.005s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:43 dist-test-slave-2x32 krb5kdc[29566](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903663, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:43Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:43Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:43.986239 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42029
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38883
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:42029
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:44.090902 29582 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:44.091161 29582 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:44.091218 29582 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:44.094638 29582 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:44.094715 29582 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:44.094743 29582 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:44.094761 29582 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:44.094779 29582 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:44.099269 29582 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38883
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:42029
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42029
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29582
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:44.100363 29582 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:44.101184 29582 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:44.106702 29590 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.106774 29588 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.106777 29587 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:44.106860 29582 server_base.cc:1061] running on GCE node
I20260504 14:07:44.107507 29582 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:44.108373 29582 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:44.109587 29582 hybrid_clock.cc:648] HybridClock initialized: now 1777903664109571 us; error 30 us; skew 500 ppm
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:44.113253 29582 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:44.114837 29582 webserver.cc:492] Webserver started at http://127.25.254.254:46797/ using document root <none> and password file <none>
I20260504 14:07:44.115387 29582 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:44.115444 29582 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:44.115612 29582 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:44.117252 29582 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "5ad9d64b6ba04a4bae3342883b9e15df"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "ddc1f7b3368eb1786ca28bb66a975748"
server_key_iv: "6823b9fc8940d288937886164d52d738"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.117698 29582 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "5ad9d64b6ba04a4bae3342883b9e15df"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "ddc1f7b3368eb1786ca28bb66a975748"
server_key_iv: "6823b9fc8940d288937886164d52d738"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.121208 29582 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:07:44.123521 29597 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.124521 29582 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:44.124650 29582 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "5ad9d64b6ba04a4bae3342883b9e15df"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "ddc1f7b3368eb1786ca28bb66a975748"
server_key_iv: "6823b9fc8940d288937886164d52d738"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.124751 29582 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:44.137012 29582 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:44.143821 29582 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:44.144050 29582 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:44.152261 29582 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:42029
I20260504 14:07:44.152257 29649 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:42029 every 8 connection(s)
I20260504 14:07:44.153321 29582 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:44.156492 29650 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:44.163028 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29582
I20260504 14:07:44.163026 29650 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: Bootstrap starting.
I20260504 14:07:44.163143 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:44.163437 26619 external_mini_cluster.cc:1468] Setting key f7ebdd991ca49b524688a19c40bd7d62
I20260504 14:07:44.165809 29650 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:44.166775 29650 log.cc:826] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:44.169114 29650 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: No bootstrap required, opened a new log
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903663, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:44.172132 29650 raft_consensus.cc:359] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } }
I20260504 14:07:44.172422 29650 raft_consensus.cc:385] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:44.172511 29650 raft_consensus.cc:740] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5ad9d64b6ba04a4bae3342883b9e15df, State: Initialized, Role: FOLLOWER
I20260504 14:07:44.173034 29650 consensus_queue.cc:260] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } }
I20260504 14:07:44.173195 29650 raft_consensus.cc:399] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:44.173274 29650 raft_consensus.cc:493] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:44.173391 29650 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:44.174706 29650 raft_consensus.cc:515] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } }
I20260504 14:07:44.175108 29650 leader_election.cc:304] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 5ad9d64b6ba04a4bae3342883b9e15df; no voters: 
I20260504 14:07:44.175480 29650 leader_election.cc:290] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:44.175890 29655 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:44.176270 29655 raft_consensus.cc:697] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [term 1 LEADER]: Becoming Leader. State: Replica: 5ad9d64b6ba04a4bae3342883b9e15df, State: Running, Role: LEADER
I20260504 14:07:44.176710 29655 consensus_queue.cc:237] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } }
I20260504 14:07:44.176890 29650 sys_catalog.cc:565] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:44.178473 29657 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [sys.catalog]: SysCatalogTable state changed. Reason: New leader 5ad9d64b6ba04a4bae3342883b9e15df. Latest consensus state: current_term: 1 leader_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } } }
I20260504 14:07:44.178484 29656 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5ad9d64b6ba04a4bae3342883b9e15df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42029 } } }
I20260504 14:07:44.178704 29657 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:44.178737 29656 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:44.178891 29653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.164871 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51998 (local address 127.25.254.254:42029)
0504 14:07:44.165338 (+   467us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:44.165350 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:44.165391 (+    41us) server_negotiation.cc:408] Connection header received
0504 14:07:44.166203 (+   812us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:44.166233 (+    30us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:44.166690 (+   457us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:44.167088 (+   398us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:44.168208 (+  1120us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.169356 (+  1148us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.170047 (+   691us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.170352 (+   305us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.172987 (+  2635us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:44.173016 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:44.173033 (+    17us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:44.173076 (+    43us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:44.175426 (+  2350us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.176073 (+   647us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.176079 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.176086 (+     7us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.176185 (+    99us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.176494 (+   309us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.176497 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.176499 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.176973 (+   474us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:44.177149 (+   176us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:44.177654 (+   505us) server_negotiation.cc:300] Negotiation successful
0504 14:07:44.177942 (+   288us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":302,"thread_start_us":145,"threads_started":1}
W20260504 14:07:44.182070 29670 catalog_manager.cc:1568] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:07:44.182148 29670 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:07:44.182238 29666 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:44.182981 29666 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:44.188391 29666 catalog_manager.cc:1357] Generated new cluster ID: 8b0c032a9bc9445da5b35cb648ac7104
I20260504 14:07:44.188484 29666 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:44.200606 29666 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:44.201524 29666 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:44.209451 29666 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 5ad9d64b6ba04a4bae3342883b9e15df: Generated new TSK 0
I20260504 14:07:44.210268 29666 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:44.283349 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42029
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38883
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:44.393419 29678 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:44.393689 29678 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:44.393750 29678 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:44.397586 29678 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:44.397673 29678 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:44.397799 29678 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:44.402526 29678 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38883
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42029
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29678
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:44.403693 29678 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:44.404610 29678 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:44.411274 29686 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.411274 29683 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.411514 29684 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:44.411966 29678 server_base.cc:1061] running on GCE node
I20260504 14:07:44.412366 29678 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:44.413012 29678 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:44.414233 29678 hybrid_clock.cc:648] HybridClock initialized: now 1777903664414209 us; error 43 us; skew 500 ppm
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:44.417033 29678 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:44.418202 29678 webserver.cc:492] Webserver started at http://127.25.254.193:41101/ using document root <none> and password file <none>
I20260504 14:07:44.418771 29678 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:44.418820 29678 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:44.418987 29678 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:44.420799 29678 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "b98852a563704edca7e675cedf88f4f2"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f512b312029373a035f28246164e6da2"
server_key_iv: "5bcaf7d9959334f433979e09d8730e91"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.421259 29678 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "b98852a563704edca7e675cedf88f4f2"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f512b312029373a035f28246164e6da2"
server_key_iv: "5bcaf7d9959334f433979e09d8730e91"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.424947 29678 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:07:44.427340 29693 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.428467 29678 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:44.428705 29678 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "b98852a563704edca7e675cedf88f4f2"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f512b312029373a035f28246164e6da2"
server_key_iv: "5bcaf7d9959334f433979e09d8730e91"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.428915 29678 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:44.447633 29678 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:44.450618 29678 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:44.450845 29678 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:44.451453 29678 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:44.452379 29678 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:44.452428 29678 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.452497 29678 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:44.452530 29678 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.462091 29678 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:38681
I20260504 14:07:44.462119 29806 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:38681 every 8 connection(s)
I20260504 14:07:44.463125 29678 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:44.470220 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29678
I20260504 14:07:44.470379 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:44.470711 26619 external_mini_cluster.cc:1468] Setting key df38993828b9598a1fd8a86c3c644788
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:44.476933 29653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.465047 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:54139 (local address 127.25.254.254:42029)
0504 14:07:44.465260 (+   213us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:44.465264 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:44.465972 (+   708us) server_negotiation.cc:408] Connection header received
0504 14:07:44.466855 (+   883us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:44.466859 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:44.466910 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:44.466993 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:44.468179 (+  1186us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.468704 (+   525us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.469389 (+   685us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.469536 (+   147us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.472660 (+  3124us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:44.472676 (+    16us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:44.472678 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:44.472704 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:44.474543 (+  1839us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.475135 (+   592us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.475142 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.475147 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.475209 (+    62us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.475578 (+   369us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.475584 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.475589 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.475823 (+   234us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:44.475948 (+   125us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:44.476485 (+   537us) server_negotiation.cc:300] Negotiation successful
0504 14:07:44.476639 (+   154us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":107}
I20260504 14:07:44.477882 29809 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.465376 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42029 (local address 127.25.254.193:54139)
0504 14:07:44.465819 (+   443us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:44.465852 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:44.466638 (+   786us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:44.467140 (+   502us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:44.467148 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:44.467532 (+   384us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:44.468015 (+   483us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.468026 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.468873 (+   847us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.468876 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:44.469275 (+   399us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.469281 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.469455 (+   174us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.470798 (+  1343us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:44.470824 (+    26us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:44.472487 (+  1663us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:44.474723 (+  2236us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.474731 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.474744 (+    13us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.475018 (+   274us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.475353 (+   335us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.475356 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.475358 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.475466 (+   108us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.475986 (+   520us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:44.475992 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:44.476224 (+   232us) client_negotiation.cc:770] Sending connection context
0504 14:07:44.476410 (+   186us) client_negotiation.cc:241] Negotiation successful
0504 14:07:44.476795 (+   385us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":257,"thread_start_us":108,"threads_started":1}
I20260504 14:07:44.479287 29807 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42029
I20260504 14:07:44.479636 29807 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:44.480198 29807 heartbeater.cc:507] Master 127.25.254.254:42029 requested a full tablet report, sending...
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:44.481840 29614 ts_manager.cc:194] Registered new tserver with Master: b98852a563704edca7e675cedf88f4f2 (127.25.254.193:38681)
I20260504 14:07:44.483155 29614 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:54139
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:44.530193 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42029
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38883
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:44.634963 29814 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:44.635222 29814 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:44.635285 29814 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:44.638775 29814 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:44.638849 29814 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:44.638932 29814 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:44.643527 29814 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38883
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42029
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29814
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:44.644654 29814 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:44.645530 29814 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:44.652801 29822 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.652801 29819 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.652774 29820 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:44.653340 29814 server_base.cc:1061] running on GCE node
I20260504 14:07:44.653748 29814 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:44.654393 29814 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:44.655640 29814 hybrid_clock.cc:648] HybridClock initialized: now 1777903664655617 us; error 43 us; skew 500 ppm
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:44.658622 29814 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:44.659757 29814 webserver.cc:492] Webserver started at http://127.25.254.194:37597/ using document root <none> and password file <none>
I20260504 14:07:44.660298 29814 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:44.660344 29814 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:44.660507 29814 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:44.662263 29814 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "df6d16592bb0415e97430bd96d83179e"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f7e83b08f387954dfda0e192b05f6e76"
server_key_iv: "7ccadf110fd1f27d2a22c476737969c7"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.662729 29814 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "df6d16592bb0415e97430bd96d83179e"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f7e83b08f387954dfda0e192b05f6e76"
server_key_iv: "7ccadf110fd1f27d2a22c476737969c7"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.666388 29814 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.002s
I20260504 14:07:44.669041 29829 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.670524 29814 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:44.670656 29814 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "df6d16592bb0415e97430bd96d83179e"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "f7e83b08f387954dfda0e192b05f6e76"
server_key_iv: "7ccadf110fd1f27d2a22c476737969c7"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.670812 29814 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:44.688863 29814 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:44.692196 29814 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:44.692407 29814 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:44.693030 29814 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:44.694028 29814 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:44.694079 29814 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.694146 29814 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:44.694218 29814 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.001s
I20260504 14:07:44.704804 29814 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:37787
I20260504 14:07:44.704823 29942 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:37787 every 8 connection(s)
I20260504 14:07:44.705922 29814 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:44.706090 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29814
I20260504 14:07:44.706238 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:44.706574 26619 external_mini_cluster.cc:1468] Setting key ddc21122d9adbf67d78acbb89a75445c
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:44.721310 29653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.708076 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:58515 (local address 127.25.254.254:42029)
0504 14:07:44.708242 (+   166us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:44.708247 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:44.709232 (+   985us) server_negotiation.cc:408] Connection header received
0504 14:07:44.710852 (+  1620us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:44.710857 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:44.710924 (+    67us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:44.711032 (+   108us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:44.712393 (+  1361us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.712950 (+   557us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.713641 (+   691us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.713849 (+   208us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.717051 (+  3202us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:44.717074 (+    23us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:44.717076 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:44.717107 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:44.718643 (+  1536us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.719357 (+   714us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.719364 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.719369 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.719517 (+   148us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.719934 (+   417us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.719937 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.719938 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.720102 (+   164us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:44.720186 (+    84us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:44.720920 (+   734us) server_negotiation.cc:300] Negotiation successful
0504 14:07:44.721047 (+   127us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":50}
I20260504 14:07:44.722070 29945 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.708435 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42029 (local address 127.25.254.194:58515)
0504 14:07:44.709049 (+   614us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:44.709098 (+    49us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:44.710588 (+  1490us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:44.711245 (+   657us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:44.711255 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:44.711703 (+   448us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:44.712229 (+   526us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.712241 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.713086 (+   845us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.713095 (+     9us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:44.713514 (+   419us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.713521 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.713769 (+   248us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.715048 (+  1279us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:44.715070 (+    22us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:44.716872 (+  1802us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:44.718836 (+  1964us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.718852 (+    16us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.718864 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.719228 (+   364us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.719656 (+   428us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.719659 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.719661 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.719817 (+   156us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.720230 (+   413us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:44.720239 (+     9us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:44.720580 (+   341us) client_negotiation.cc:770] Sending connection context
0504 14:07:44.720856 (+   276us) client_negotiation.cc:241] Negotiation successful
0504 14:07:44.721150 (+   294us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":375,"thread_start_us":141,"threads_started":1}
I20260504 14:07:44.723462 29943 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42029
I20260504 14:07:44.723747 29943 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:44.724293 29943 heartbeater.cc:507] Master 127.25.254.254:42029 requested a full tablet report, sending...
I20260504 14:07:44.725499 29614 ts_manager.cc:194] Registered new tserver with Master: df6d16592bb0415e97430bd96d83179e (127.25.254.194:37787)
I20260504 14:07:44.726097 29614 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:58515
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:44.766369 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42029
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38883
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:44.873232 29950 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:44.873541 29950 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:44.873636 29950 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:44.877308 29950 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:44.877427 29950 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:44.877542 29950 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:44.882190 29950 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38883
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42029
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.29950
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:44.883344 29950 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:44.884354 29950 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:44.891243 29956 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.891244 29955 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:44.891243 29958 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:44.891474 29950 server_base.cc:1061] running on GCE node
I20260504 14:07:44.892261 29950 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:44.892874 29950 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:44.894058 29950 hybrid_clock.cc:648] HybridClock initialized: now 1777903664894036 us; error 38 us; skew 500 ppm
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:44.897081 29950 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:44.898192 29950 webserver.cc:492] Webserver started at http://127.25.254.195:41053/ using document root <none> and password file <none>
I20260504 14:07:44.898782 29950 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:44.898854 29950 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:44.899068 29950 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:44.900792 29950 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "70d1a3358e494e87a86153ade47907f5"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "c0311160607fd5a868fc3a4407c46b9c"
server_key_iv: "7dc03397a9ee4b7a3a5009f2b0398f31"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.901284 29950 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "70d1a3358e494e87a86153ade47907f5"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "c0311160607fd5a868fc3a4407c46b9c"
server_key_iv: "7dc03397a9ee4b7a3a5009f2b0398f31"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.904829 29950 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.003s
I20260504 14:07:44.907161 29965 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.908196 29950 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:44.908327 29950 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "70d1a3358e494e87a86153ade47907f5"
format_stamp: "Formatted at 2026-05-04 14:07:44 on dist-test-slave-2x32"
server_key: "c0311160607fd5a868fc3a4407c46b9c"
server_key_iv: "7dc03397a9ee4b7a3a5009f2b0398f31"
server_key_version: "encryptionkey@0"
I20260504 14:07:44.908433 29950 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:44.927381 29950 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:44.930536 29950 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:44.930749 29950 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:44.931327 29950 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:44.932289 29950 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:44.932353 29950 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.932417 29950 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:44.932585 29950 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:44.942624 29950 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:42813
I20260504 14:07:44.942639 30078 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:42813 every 8 connection(s)
I20260504 14:07:44.943635 29950 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
May 04 14:07:44 dist-test-slave-2x32 krb5kdc[29566](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903664, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:44.952643 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 29950
I20260504 14:07:44.952801 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRebalanceReportUnauthorized.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:44.953080 26619 external_mini_cluster.cc:1468] Setting key ea1b3b4a4a55ff8242d6106e2dee41b6
I20260504 14:07:44.956907 29653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.945441 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:38327 (local address 127.25.254.254:42029)
0504 14:07:44.945664 (+   223us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:44.945668 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:44.946432 (+   764us) server_negotiation.cc:408] Connection header received
0504 14:07:44.947395 (+   963us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:44.947399 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:44.947460 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:44.947567 (+   107us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:44.948753 (+  1186us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.949258 (+   505us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.949874 (+   616us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.950053 (+   179us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.952728 (+  2675us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:44.952767 (+    39us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:44.952770 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:44.952801 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:44.954446 (+  1645us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.955065 (+   619us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.955068 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.955070 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.955115 (+    45us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:44.955480 (+   365us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:44.955483 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:44.955485 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:44.955813 (+   328us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:44.955932 (+   119us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:44.956638 (+   706us) server_negotiation.cc:300] Negotiation successful
0504 14:07:44.956773 (+   135us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":121}
I20260504 14:07:44.957640 30081 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:44.945730 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42029 (local address 127.25.254.195:38327)
0504 14:07:44.946244 (+   514us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:44.946308 (+    64us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:44.947189 (+   881us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:44.947735 (+   546us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:44.947743 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:44.948137 (+   394us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:44.948602 (+   465us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.948614 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.949375 (+   761us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:44.949383 (+     8us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:44.949759 (+   376us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:44.949765 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:44.949980 (+   215us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:44.951081 (+  1101us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:44.951099 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:44.952567 (+  1468us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:44.954619 (+  2052us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.954625 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.954639 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.954969 (+   330us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.955236 (+   267us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:44.955241 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:44.955243 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:44.955382 (+   139us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:44.955962 (+   580us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:44.955968 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:44.956209 (+   241us) client_negotiation.cc:770] Sending connection context
0504 14:07:44.956419 (+   210us) client_negotiation.cc:241] Negotiation successful
0504 14:07:44.956714 (+   295us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":274,"thread_start_us":110,"threads_started":1}
I20260504 14:07:44.958861 30079 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42029
I20260504 14:07:44.959117 30079 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:44.959632 30079 heartbeater.cc:507] Master 127.25.254.254:42029 requested a full tablet report, sending...
I20260504 14:07:44.960711 29614 ts_manager.cc:194] Registered new tserver with Master: 70d1a3358e494e87a86153ade47907f5 (127.25.254.195:42813)
I20260504 14:07:44.961222 29614 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:38327
I20260504 14:07:44.967128 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
W20260504 14:07:45.098271 29653 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:45.094576 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:52006 (local address 127.25.254.254:42029)
0504 14:07:45.094723 (+   147us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:45.094727 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:45.095541 (+   814us) server_negotiation.cc:408] Connection header received
0504 14:07:45.096466 (+   925us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:45.096470 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:45.096525 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:45.096606 (+    81us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:45.098088 (+  1482us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:52006: BlockingRecv error: recv got EOF from 127.0.0.1:52006 (error 108)
Metrics: {"server-negotiator.queue_time_us":47}
W20260504 14:07:45.100219 29653 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:45.098881 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:52010 (local address 127.25.254.254:42029)
0504 14:07:45.099049 (+   168us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:45.099053 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:45.099067 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:45.099327 (+   260us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:45.099331 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:45.099541 (+   210us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:45.099666 (+   125us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:45.100085 (+   419us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:52010: BlockingRecv error: recv got EOF from 127.0.0.1:52010 (error 108)
Metrics: {"server-negotiator.queue_time_us":71}
W20260504 14:07:45.101366 29653 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:45.100613 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:52014 (local address 127.25.254.254:42029)
0504 14:07:45.100764 (+   151us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:45.100767 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:45.100780 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:07:45.100835 (+    55us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:45.100838 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:45.100873 (+    35us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:45.100943 (+    70us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:45.101277 (+   334us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:52014: BlockingRecv error: recv got EOF from 127.0.0.1:52014 (error 108)
Metrics: {"server-negotiator.queue_time_us":77}
W20260504 14:07:45.105707 29653 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:45.104786 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:52016 (local address 127.25.254.254:42029)
0504 14:07:45.104929 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:45.104933 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:45.104972 (+    39us) server_negotiation.cc:408] Connection header received
0504 14:07:45.105015 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:45.105018 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:45.105057 (+    39us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:45.105127 (+    70us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:45.105580 (+   453us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:52016: BlockingRecv error: recv got EOF from 127.0.0.1:52016 (error 108)
Metrics: {"server-negotiator.queue_time_us":63}
I20260504 14:07:45.114118 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29678
I20260504 14:07:45.121457 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29814
I20260504 14:07:45.127563 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29950
I20260504 14:07:45.133286 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 29582
2026-05-04T14:07:45Z chronyd exiting
[       OK ] SecurityITest.TestRebalanceReportUnauthorized (3429 ms)
[ RUN      ] SecurityITest.SaslPlainFallback
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:45 dist-test-slave-2x32 krb5kdc[30102](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:45 dist-test-slave-2x32 krb5kdc[30102](info): set up 2 sockets
May 04 14:07:45 dist-test-slave-2x32 krb5kdc[30102](info): commencing operation
krb5kdc: starting...
W20260504 14:07:47.162648 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.007s	user 0.001s	sys 0.005s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30102](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903667, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:47Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:47Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:47.325434 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:45109
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34157
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:45109
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--rpc-authentication=optional
--user-acl=* with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:47.431352 30118 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:47.431607 30118 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:47.431658 30118 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:47.435092 30118 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:47.435158 30118 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:47.435181 30118 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:47.435201 30118 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:47.435220 30118 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:47.439581 30118 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34157
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:45109
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:45109
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--superuser_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30118
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:47.440729 30118 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:47.441604 30118 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:47.448202 30126 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:47.448196 30123 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:47.448199 30124 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:47.448832 30118 server_base.cc:1061] running on GCE node
I20260504 14:07:47.449474 30118 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:47.450572 30118 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:47.451853 30118 hybrid_clock.cc:648] HybridClock initialized: now 1777903667451822 us; error 49 us; skew 500 ppm
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30102](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903667, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:47.455015 30118 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:47.456496 30118 webserver.cc:492] Webserver started at http://127.25.254.254:34483/ using document root <none> and password file <none>
I20260504 14:07:47.457172 30118 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:47.457249 30118 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:47.457468 30118 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:47.459523 30118 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "7d794633ffdc47e681f01bf7ade092a9"
format_stamp: "Formatted at 2026-05-04 14:07:47 on dist-test-slave-2x32"
server_key: "a323a572f59a016d585bedd567fc223d"
server_key_iv: "56dbbb6bb7a280c62dd4e7f6e4d63708"
server_key_version: "encryptionkey@0"
I20260504 14:07:47.460130 30118 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "7d794633ffdc47e681f01bf7ade092a9"
format_stamp: "Formatted at 2026-05-04 14:07:47 on dist-test-slave-2x32"
server_key: "a323a572f59a016d585bedd567fc223d"
server_key_iv: "56dbbb6bb7a280c62dd4e7f6e4d63708"
server_key_version: "encryptionkey@0"
I20260504 14:07:47.463816 30118 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.001s	sys 0.004s
I20260504 14:07:47.466401 30133 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:47.467609 30118 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.001s
I20260504 14:07:47.467772 30118 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "7d794633ffdc47e681f01bf7ade092a9"
format_stamp: "Formatted at 2026-05-04 14:07:47 on dist-test-slave-2x32"
server_key: "a323a572f59a016d585bedd567fc223d"
server_key_iv: "56dbbb6bb7a280c62dd4e7f6e4d63708"
server_key_version: "encryptionkey@0"
I20260504 14:07:47.467887 30118 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:47.491997 30118 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:47.495298 30118 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:47.495551 30118 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:47.504633 30118 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:45109
I20260504 14:07:47.504627 30185 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:45109 every 8 connection(s)
I20260504 14:07:47.505746 30118 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:47.508639 30186 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:47.511977 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30118
I20260504 14:07:47.512099 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.SaslPlainFallback.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:47.512367 26619 external_mini_cluster.cc:1468] Setting key 89098f58dfb02b477271c7ff4dd60817
I20260504 14:07:47.514891 30186 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9: Bootstrap starting.
I20260504 14:07:47.517475 30186 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:47.518357 30186 log.cc:826] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:47.520921 30186 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9: No bootstrap required, opened a new log
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30102](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903667, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:47.524430 30186 raft_consensus.cc:359] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } }
I20260504 14:07:47.524705 30186 raft_consensus.cc:385] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:47.524829 30186 raft_consensus.cc:740] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d794633ffdc47e681f01bf7ade092a9, State: Initialized, Role: FOLLOWER
I20260504 14:07:47.525364 30186 consensus_queue.cc:260] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } }
I20260504 14:07:47.525552 30186 raft_consensus.cc:399] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:47.525650 30186 raft_consensus.cc:493] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:47.525748 30186 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:47.526932 30186 raft_consensus.cc:515] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } }
I20260504 14:07:47.527362 30189 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:47.513789 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:58060 (local address 127.25.254.254:45109)
0504 14:07:47.514280 (+   491us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:47.514293 (+    13us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:47.514332 (+    39us) server_negotiation.cc:408] Connection header received
0504 14:07:47.515038 (+   706us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:47.515062 (+    24us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:47.515422 (+   360us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:47.515784 (+   362us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:47.516646 (+   862us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:47.517799 (+  1153us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:47.518674 (+   875us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:47.518991 (+   317us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:47.521724 (+  2733us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:47.521751 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:47.521766 (+    15us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:47.521801 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:47.524079 (+  2278us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:47.524557 (+   478us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:47.524564 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:47.524570 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:47.524667 (+    97us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:47.525000 (+   333us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:47.525003 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:47.525005 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:47.525428 (+   423us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:47.525629 (+   201us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:47.525970 (+   341us) server_negotiation.cc:300] Negotiation successful
0504 14:07:47.526240 (+   270us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":309,"thread_start_us":153,"threads_started":1}
I20260504 14:07:47.527361 30186 leader_election.cc:304] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 7d794633ffdc47e681f01bf7ade092a9; no voters: 
I20260504 14:07:47.527763 30186 leader_election.cc:290] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:47.527905 30191 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:47.528136 30191 raft_consensus.cc:697] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [term 1 LEADER]: Becoming Leader. State: Replica: 7d794633ffdc47e681f01bf7ade092a9, State: Running, Role: LEADER
I20260504 14:07:47.528549 30191 consensus_queue.cc:237] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } }
I20260504 14:07:47.529006 30186 sys_catalog.cc:565] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:47.529704 30193 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 7d794633ffdc47e681f01bf7ade092a9. Latest consensus state: current_term: 1 leader_uuid: "7d794633ffdc47e681f01bf7ade092a9" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } } }
I20260504 14:07:47.530066 30193 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:47.530277 30192 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "7d794633ffdc47e681f01bf7ade092a9" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d794633ffdc47e681f01bf7ade092a9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45109 } } }
I20260504 14:07:47.530376 30192 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:47.530735 30200 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:47.530746 26619 external_mini_cluster.cc:949] 0 TS(s) registered with all masters
I20260504 14:07:47.534480 30200 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:47.547791 30200 catalog_manager.cc:1357] Generated new cluster ID: 3e2b4a67a74f4a739439b00fc590c407
I20260504 14:07:47.547879 30200 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:47.552141 30189 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:47.547104 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:58062 (local address 127.25.254.254:45109)
0504 14:07:47.547243 (+   139us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:47.547248 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:47.547491 (+   243us) server_negotiation.cc:408] Connection header received
0504 14:07:47.547658 (+   167us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:47.547664 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:47.547727 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:47.547800 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:47.548673 (+   873us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:47.549407 (+   734us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:47.550335 (+   928us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:47.550590 (+   255us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:47.551048 (+   458us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:47.551067 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:47.551071 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:47.551652 (+   581us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:47.551689 (+    37us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:47.551709 (+    20us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:47.551818 (+   109us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:47.551915 (+    97us) server_negotiation.cc:300] Negotiation successful
0504 14:07:47.551979 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":53}
I20260504 14:07:47.566473 30200 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:47.567871 30200 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:47.574262 30200 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 7d794633ffdc47e681f01bf7ade092a9: Generated new TSK 0
I20260504 14:07:47.574878 30200 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:07:47.585131 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30118
2026-05-04T14:07:47Z chronyd exiting
[       OK ] SecurityITest.SaslPlainFallback (2449 ms)
[ RUN      ] SecurityITest.TestUnauthorizedClientKerberosCredentials
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30219](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30219](info): set up 2 sockets
May 04 14:07:47 dist-test-slave-2x32 krb5kdc[30219](info): commencing operation
krb5kdc: starting...
W20260504 14:07:49.598981 26619 mini_kdc.cc:121] Time spent starting KDC: real 1.993s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:49 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903669, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:49Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:49Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:49.751703 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42507
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:42507
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:49.855199 30235 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:49.855465 30235 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:49.855520 30235 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:49.859019 30235 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:49.859091 30235 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:49.859115 30235 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:49.859135 30235 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:49.859153 30235 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:49.863672 30235 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:42507
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42507
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30235
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:49.864704 30235 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:49.865491 30235 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:49.870915 30243 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:49.870935 30240 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:49.870919 30241 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:49.871243 30235 server_base.cc:1061] running on GCE node
I20260504 14:07:49.871781 30235 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:49.872733 30235 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:49.873920 30235 hybrid_clock.cc:648] HybridClock initialized: now 1777903669873898 us; error 35 us; skew 500 ppm
May 04 14:07:49 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903669, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:49.876904 30235 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:49.878023 30235 webserver.cc:492] Webserver started at http://127.25.254.254:35277/ using document root <none> and password file <none>
I20260504 14:07:49.878726 30235 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:49.878803 30235 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:49.879020 30235 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:49.880672 30235 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "340e18abdb014080ba3aa691579bfde0"
format_stamp: "Formatted at 2026-05-04 14:07:49 on dist-test-slave-2x32"
server_key: "a58aa32e9c5753efc4682f6d8f5bb2b0"
server_key_iv: "276046a33606a1b70edbf87ac2880b43"
server_key_version: "encryptionkey@0"
I20260504 14:07:49.881146 30235 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "340e18abdb014080ba3aa691579bfde0"
format_stamp: "Formatted at 2026-05-04 14:07:49 on dist-test-slave-2x32"
server_key: "a58aa32e9c5753efc4682f6d8f5bb2b0"
server_key_iv: "276046a33606a1b70edbf87ac2880b43"
server_key_version: "encryptionkey@0"
I20260504 14:07:49.884601 30235 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.003s
I20260504 14:07:49.886894 30250 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:49.887954 30235 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:49.888096 30235 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "340e18abdb014080ba3aa691579bfde0"
format_stamp: "Formatted at 2026-05-04 14:07:49 on dist-test-slave-2x32"
server_key: "a58aa32e9c5753efc4682f6d8f5bb2b0"
server_key_iv: "276046a33606a1b70edbf87ac2880b43"
server_key_version: "encryptionkey@0"
I20260504 14:07:49.888206 30235 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:49.907615 30235 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:49.916666 30235 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:49.916899 30235 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:49.924880 30302 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:42507 every 8 connection(s)
I20260504 14:07:49.924878 30235 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:42507
I20260504 14:07:49.926214 30235 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:49.927989 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30235
I20260504 14:07:49.928138 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:49.928367 26619 external_mini_cluster.cc:1468] Setting key 8fa08904b67d79c5ee420547a571989a
I20260504 14:07:49.929574 30303 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:49.936466 30303 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0: Bootstrap starting.
May 04 14:07:49 dist-test-slave-2x32 krb5kdc[30219](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903669, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:49.939365 30303 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:49.940333 30303 log.cc:826] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:49.942814 30306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:49.929625 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34256 (local address 127.25.254.254:42507)
0504 14:07:49.930067 (+   442us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:49.930079 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:49.930121 (+    42us) server_negotiation.cc:408] Connection header received
0504 14:07:49.931010 (+   889us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:49.931037 (+    27us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:49.931401 (+   364us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:49.931791 (+   390us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:49.932534 (+   743us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:49.933864 (+  1330us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:49.934604 (+   740us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:49.934891 (+   287us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:49.937122 (+  2231us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:49.937186 (+    64us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:49.937202 (+    16us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:49.937237 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:49.939735 (+  2498us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:49.940196 (+   461us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:49.940206 (+    10us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:49.940215 (+     9us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:49.940315 (+   100us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:49.940644 (+   329us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:49.940648 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:49.940650 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:49.941074 (+   424us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:49.941230 (+   156us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:49.941745 (+   515us) server_negotiation.cc:300] Negotiation successful
0504 14:07:49.942003 (+   258us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":290,"thread_start_us":108,"threads_started":1}
I20260504 14:07:49.942996 30303 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0: No bootstrap required, opened a new log
I20260504 14:07:49.945854 30303 raft_consensus.cc:359] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } }
I20260504 14:07:49.946053 30303 raft_consensus.cc:385] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:49.946142 30303 raft_consensus.cc:740] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 340e18abdb014080ba3aa691579bfde0, State: Initialized, Role: FOLLOWER
I20260504 14:07:49.946638 30303 consensus_queue.cc:260] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } }
I20260504 14:07:49.946763 30303 raft_consensus.cc:399] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:49.946810 30303 raft_consensus.cc:493] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:49.946923 30303 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:49.947846 30303 raft_consensus.cc:515] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } }
I20260504 14:07:49.948191 30303 leader_election.cc:304] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 340e18abdb014080ba3aa691579bfde0; no voters: 
I20260504 14:07:49.948489 30303 leader_election.cc:290] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:49.948683 30308 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:49.948911 30308 raft_consensus.cc:697] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [term 1 LEADER]: Becoming Leader. State: Replica: 340e18abdb014080ba3aa691579bfde0, State: Running, Role: LEADER
I20260504 14:07:49.949194 30308 consensus_queue.cc:237] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } }
I20260504 14:07:49.949656 30303 sys_catalog.cc:565] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:49.950722 30310 sys_catalog.cc:455] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 340e18abdb014080ba3aa691579bfde0. Latest consensus state: current_term: 1 leader_uuid: "340e18abdb014080ba3aa691579bfde0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } } }
I20260504 14:07:49.950834 30310 sys_catalog.cc:458] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:49.951349 30309 sys_catalog.cc:455] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "340e18abdb014080ba3aa691579bfde0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "340e18abdb014080ba3aa691579bfde0" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42507 } } }
I20260504 14:07:49.951442 30309 sys_catalog.cc:458] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:49.952207 30315 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:49.955330 30315 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:49.960858 30315 catalog_manager.cc:1357] Generated new cluster ID: 1c612e51cc1149c083ea3638d8516c4d
I20260504 14:07:49.960940 30315 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:49.999089 30315 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:50.000293 30315 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:50.013829 30315 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 340e18abdb014080ba3aa691579bfde0: Generated new TSK 0
I20260504 14:07:50.014714 30315 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:50.083482 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42507
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:50.191062 30331 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:50.191315 30331 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:50.191416 30331 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:50.194933 30331 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:50.195005 30331 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:50.195087 30331 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:50.199550 30331 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42507
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30331
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:50.200623 30331 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:50.201475 30331 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:50.207803 30339 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.207840 30336 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.207859 30337 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:50.208706 30331 server_base.cc:1061] running on GCE node
I20260504 14:07:50.209141 30331 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:50.209743 30331 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:50.210927 30331 hybrid_clock.cc:648] HybridClock initialized: now 1777903670210915 us; error 26 us; skew 500 ppm
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:50.213800 30331 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:50.214941 30331 webserver.cc:492] Webserver started at http://127.25.254.193:36863/ using document root <none> and password file <none>
I20260504 14:07:50.215534 30331 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:50.215605 30331 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:50.215826 30331 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:50.217531 30331 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "ec6f36cb416c4b80bc717a7822dda52c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "9a281c29e12452ba7de91664de1177ca"
server_key_iv: "8eb7bc22b3139532310e12afe2413433"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.218041 30331 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "ec6f36cb416c4b80bc717a7822dda52c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "9a281c29e12452ba7de91664de1177ca"
server_key_iv: "8eb7bc22b3139532310e12afe2413433"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.221490 30331 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:50.223948 30346 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.225010 30331 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.001s
I20260504 14:07:50.225139 30331 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "ec6f36cb416c4b80bc717a7822dda52c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "9a281c29e12452ba7de91664de1177ca"
server_key_iv: "8eb7bc22b3139532310e12afe2413433"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.225244 30331 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:50.238354 30331 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:50.241313 30331 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:50.241528 30331 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:50.242112 30331 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:50.243083 30331 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:50.243153 30331 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.243261 30331 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:50.243316 30331 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.252835 30331 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:36713
I20260504 14:07:50.252858 30459 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:36713 every 8 connection(s)
I20260504 14:07:50.253777 30331 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:50.259802 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30331
I20260504 14:07:50.259968 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:50.260259 26619 external_mini_cluster.cc:1468] Setting key b0023603cb0e789057c33c4ef43b5de0
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:50.267985 30306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.255578 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:39845 (local address 127.25.254.254:42507)
0504 14:07:50.255715 (+   137us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:50.255719 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:50.256432 (+   713us) server_negotiation.cc:408] Connection header received
0504 14:07:50.257245 (+   813us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:50.257249 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:50.257298 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:50.257382 (+    84us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:50.258633 (+  1251us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.259175 (+   542us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.259897 (+   722us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.260117 (+   220us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.263602 (+  3485us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:50.263628 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:50.263634 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:50.263673 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:50.265679 (+  2006us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.266244 (+   565us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.266251 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.266256 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.266317 (+    61us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.266709 (+   392us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.266715 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.266719 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.266939 (+   220us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:50.267059 (+   120us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:50.267628 (+   569us) server_negotiation.cc:300] Negotiation successful
0504 14:07:50.267774 (+   146us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":47}
I20260504 14:07:50.268518 30462 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.255881 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42507 (local address 127.25.254.193:39845)
0504 14:07:50.256281 (+   400us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:50.256314 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:50.257049 (+   735us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:50.257524 (+   475us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:50.257532 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:50.257938 (+   406us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:50.258449 (+   511us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.258459 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.259315 (+   856us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.259318 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:50.259761 (+   443us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.259768 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.259965 (+   197us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.261312 (+  1347us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:50.261331 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:50.263349 (+  2018us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:50.265822 (+  2473us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.265828 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.265837 (+     9us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.266078 (+   241us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.266443 (+   365us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.266446 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.266448 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.266591 (+   143us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.267075 (+   484us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:50.267080 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:50.267345 (+   265us) client_negotiation.cc:770] Sending connection context
0504 14:07:50.267533 (+   188us) client_negotiation.cc:241] Negotiation successful
0504 14:07:50.267743 (+   210us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":239,"thread_start_us":104,"threads_started":1}
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:50.269871 30460 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42507
I20260504 14:07:50.270140 30460 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:50.270727 30460 heartbeater.cc:507] Master 127.25.254.254:42507 requested a full tablet report, sending...
I20260504 14:07:50.272352 30267 ts_manager.cc:194] Registered new tserver with Master: ec6f36cb416c4b80bc717a7822dda52c (127.25.254.193:36713)
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
I20260504 14:07:50.273770 30267 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:39845
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:50.317145 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42507
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:50.421980 30467 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:50.422317 30467 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:50.422386 30467 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:50.425899 30467 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:50.425977 30467 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:50.426060 30467 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:50.430649 30467 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42507
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30467
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:50.431700 30467 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:50.432592 30467 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:50.439803 30472 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.439803 30473 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.439910 30475 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:50.440021 30467 server_base.cc:1061] running on GCE node
I20260504 14:07:50.440447 30467 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:50.441195 30467 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:50.442361 30467 hybrid_clock.cc:648] HybridClock initialized: now 1777903670442295 us; error 87 us; skew 500 ppm
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:50.445675 30467 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:50.446961 30467 webserver.cc:492] Webserver started at http://127.25.254.194:38217/ using document root <none> and password file <none>
I20260504 14:07:50.447561 30467 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:50.447634 30467 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:50.447857 30467 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:50.449661 30467 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "d1303e42ae7745b5bf9bfe3942a4719c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "21d34237c2df4c6f263af55988f6e827"
server_key_iv: "351346f02aa0b443bc5166ff0f1f6672"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.450246 30467 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "d1303e42ae7745b5bf9bfe3942a4719c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "21d34237c2df4c6f263af55988f6e827"
server_key_iv: "351346f02aa0b443bc5166ff0f1f6672"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.453944 30467 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:50.456310 30482 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.457419 30467 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:50.457567 30467 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "d1303e42ae7745b5bf9bfe3942a4719c"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "21d34237c2df4c6f263af55988f6e827"
server_key_iv: "351346f02aa0b443bc5166ff0f1f6672"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.457679 30467 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:50.471998 30467 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:50.474957 30467 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:50.475159 30467 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:50.475731 30467 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:50.476632 30467 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:50.476702 30467 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.476768 30467 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:50.476815 30467 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.486986 30467 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:43549
I20260504 14:07:50.486999 30595 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:43549 every 8 connection(s)
I20260504 14:07:50.487970 30467 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:50.493379 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30467
I20260504 14:07:50.493497 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:50.493767 26619 external_mini_cluster.cc:1468] Setting key 0bf9681de8f566450c10df73a2dcc20d
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:07:50.506551 30306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.489781 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:37187 (local address 127.25.254.254:42507)
0504 14:07:50.489942 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:50.489946 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:50.490708 (+   762us) server_negotiation.cc:408] Connection header received
0504 14:07:50.491614 (+   906us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:50.491617 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:50.491666 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:50.491735 (+    69us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:50.492898 (+  1163us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.493617 (+   719us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.495083 (+  1466us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.495842 (+   759us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.501346 (+  5504us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:50.501380 (+    34us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:50.501388 (+     8us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:50.501441 (+    53us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:50.503355 (+  1914us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.504258 (+   903us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.504265 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.504270 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.504335 (+    65us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.504796 (+   461us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.504799 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.504801 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.505208 (+   407us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:50.505289 (+    81us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:50.506129 (+   840us) server_negotiation.cc:300] Negotiation successful
0504 14:07:50.506296 (+   167us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":62}
I20260504 14:07:50.507769 30598 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.490052 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42507 (local address 127.25.254.194:37187)
0504 14:07:50.490553 (+   501us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:50.490590 (+    37us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:50.491355 (+   765us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:50.491877 (+   522us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:50.491885 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:50.492277 (+   392us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:50.492750 (+   473us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.492761 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.493771 (+  1010us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.493774 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:50.494735 (+   961us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.494747 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.496107 (+  1360us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.497898 (+  1791us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:50.497943 (+    45us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:50.500969 (+  3026us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:50.503690 (+  2721us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.503698 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.503712 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.504103 (+   391us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.504498 (+   395us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.504504 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.504509 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.504669 (+   160us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.505381 (+   712us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:50.505387 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:50.505790 (+   403us) client_negotiation.cc:770] Sending connection context
0504 14:07:50.506856 (+  1066us) client_negotiation.cc:241] Negotiation successful
0504 14:07:50.507079 (+   223us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":325,"thread_start_us":156,"threads_started":1}
I20260504 14:07:50.509162 30596 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42507
I20260504 14:07:50.509565 30596 heartbeater.cc:461] Registering TS with master...
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:50.510303 30596 heartbeater.cc:507] Master 127.25.254.254:42507 requested a full tablet report, sending...
I20260504 14:07:50.511751 30267 ts_manager.cc:194] Registered new tserver with Master: d1303e42ae7745b5bf9bfe3942a4719c (127.25.254.194:43549)
I20260504 14:07:50.512622 30267 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:37187
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:50.562397 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:42507
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:50.670987 30603 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:50.671227 30603 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:50.671285 30603 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:50.674595 30603 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:50.674664 30603 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:50.674742 30603 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:50.679167 30603 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:42507
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30603
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:50.680186 30603 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:50.681008 30603 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:50.687616 30608 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.687593 30611 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:50.687593 30609 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:50.688052 30603 server_base.cc:1061] running on GCE node
I20260504 14:07:50.688481 30603 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:50.689071 30603 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:50.690241 30603 hybrid_clock.cc:648] HybridClock initialized: now 1777903670690223 us; error 55 us; skew 500 ppm
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:50.693222 30603 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:50.694314 30603 webserver.cc:492] Webserver started at http://127.25.254.195:38137/ using document root <none> and password file <none>
I20260504 14:07:50.694861 30603 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:50.694907 30603 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:50.695071 30603 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:50.696748 30603 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "855274f2f61140e0856305d5f01a5d68"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "a4202457a605374677efb19111165153"
server_key_iv: "ec3a45d14fdc08b00024ef7ce92c497c"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.697189 30603 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "855274f2f61140e0856305d5f01a5d68"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "a4202457a605374677efb19111165153"
server_key_iv: "ec3a45d14fdc08b00024ef7ce92c497c"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.700568 30603 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.003s
I20260504 14:07:50.702848 30618 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.703891 30603 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:50.704005 30603 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "855274f2f61140e0856305d5f01a5d68"
format_stamp: "Formatted at 2026-05-04 14:07:50 on dist-test-slave-2x32"
server_key: "a4202457a605374677efb19111165153"
server_key_iv: "ec3a45d14fdc08b00024ef7ce92c497c"
server_key_version: "encryptionkey@0"
I20260504 14:07:50.704088 30603 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:50.746101 30603 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:50.749389 30603 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:50.749558 30603 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:50.750109 30603 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:50.751273 30603 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:50.751323 30603 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.751366 30603 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:50.751381 30603 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:50.762390 30603 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37885
I20260504 14:07:50.762415 30731 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37885 every 8 connection(s)
I20260504 14:07:50.763428 30603 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:50.769510 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30603
I20260504 14:07:50.769685 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestUnauthorizedClientKerberosCredentials.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:50.769966 26619 external_mini_cluster.cc:1468] Setting key 8e0a0e7d8c2f1d6c5dc59bbb3b3c7b79
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:50.778108 30306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.765413 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:33419 (local address 127.25.254.254:42507)
0504 14:07:50.765660 (+   247us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:50.765664 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:50.766417 (+   753us) server_negotiation.cc:408] Connection header received
0504 14:07:50.767895 (+  1478us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:50.767900 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:50.767970 (+    70us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:50.768081 (+   111us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:50.769458 (+  1377us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.770059 (+   601us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.770857 (+   798us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.771060 (+   203us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.774348 (+  3288us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:50.774378 (+    30us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:50.774380 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:50.774410 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:50.775950 (+  1540us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.776630 (+   680us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.776634 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.776635 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.776687 (+    52us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.777061 (+   374us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.777064 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.777066 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.777228 (+   162us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:50.777318 (+    90us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:50.777862 (+   544us) server_negotiation.cc:300] Negotiation successful
0504 14:07:50.777976 (+   114us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":139}
I20260504 14:07:50.778882 30734 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.765735 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42507 (local address 127.25.254.195:33419)
0504 14:07:50.766265 (+   530us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:50.766310 (+    45us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:50.767387 (+  1077us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:50.768266 (+   879us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:50.768275 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:50.768791 (+   516us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:50.769287 (+   496us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.769299 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.770229 (+   930us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.770232 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:50.770734 (+   502us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:50.770743 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.770979 (+   236us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.772181 (+  1202us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:50.772204 (+    23us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:50.774088 (+  1884us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:50.776125 (+  2037us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.776131 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.776143 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.776428 (+   285us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.776812 (+   384us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:50.776816 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:50.776818 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:50.776928 (+   110us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:50.777336 (+   408us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:50.777342 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:50.777592 (+   250us) client_negotiation.cc:770] Sending connection context
0504 14:07:50.777810 (+   218us) client_negotiation.cc:241] Negotiation successful
0504 14:07:50.778019 (+   209us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":278,"thread_start_us":117,"threads_started":1}
I20260504 14:07:50.780130 30732 heartbeater.cc:344] Connected to a master server at 127.25.254.254:42507
I20260504 14:07:50.780401 30732 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:50.780941 30732 heartbeater.cc:507] Master 127.25.254.254:42507 requested a full tablet report, sending...
I20260504 14:07:50.782130 30267 ts_manager.cc:194] Registered new tserver with Master: 855274f2f61140e0856305d5f01a5d68 (127.25.254.195:37885)
I20260504 14:07:50.782728 30267 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:33419
I20260504 14:07:50.784338 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, joe-interloper@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for joe-interloper@KRBTEST.COM: 
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30219](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903670, etypes {rep=17 tkt=17 ses=17}, joe-interloper@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:50.810729 30306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:50.802195 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34266 (local address 127.25.254.254:42507)
0504 14:07:50.802489 (+   294us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:50.802493 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:50.802584 (+    91us) server_negotiation.cc:408] Connection header received
0504 14:07:50.802735 (+   151us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:50.802738 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:50.802789 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:50.802858 (+    69us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:50.803754 (+   896us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.804283 (+   529us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:50.805037 (+   754us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:50.805187 (+   150us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:50.807496 (+  2309us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:50.807525 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:50.807532 (+     7us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:50.807572 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:50.809039 (+  1467us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.809432 (+   393us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.809434 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.809436 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.809489 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:50.809781 (+   292us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:50.809783 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:50.809785 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:50.809970 (+   185us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:50.810079 (+   109us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:50.810466 (+   387us) server_negotiation.cc:300] Negotiation successful
0504 14:07:50.810583 (+   117us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":187}
W20260504 14:07:50.811264 30267 server_base.cc:1143] Unauthorized access attempt to method kudu.master.MasterService.ConnectToMaster from {username='joe-interloper', principal='joe-interloper@KRBTEST.COM'} at 127.0.0.1:34266
I20260504 14:07:50.812626 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30331
I20260504 14:07:50.819133 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30467
I20260504 14:07:50.825023 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30603
I20260504 14:07:50.830821 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30235
2026-05-04T14:07:50Z chronyd exiting
[       OK ] SecurityITest.TestUnauthorizedClientKerberosCredentials (3248 ms)
[ RUN      ] SecurityITest.TestAuthorizedSuperuser
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30744](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30744](info): set up 2 sockets
May 04 14:07:50 dist-test-slave-2x32 krb5kdc[30744](info): commencing operation
krb5kdc: starting...
W20260504 14:07:52.886755 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.033s	user 0.001s	sys 0.005s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:52 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903672, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:52Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:52Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:53.046969 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:36015
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:41291
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:36015
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:53.156273 30760 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:53.156581 30760 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:53.156700 30760 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:53.160377 30760 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:53.160499 30760 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:53.160544 30760 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:53.160605 30760 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:53.160665 30760 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:53.165612 30760 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:41291
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:36015
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:36015
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30760
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:53.166975 30760 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:53.168012 30760 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:53.174017 30765 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:53.174083 30760 server_base.cc:1061] running on GCE node
W20260504 14:07:53.174017 30766 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:53.174017 30768 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:53.174880 30760 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:53.175864 30760 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:53.177125 30760 hybrid_clock.cc:648] HybridClock initialized: now 1777903673177101 us; error 55 us; skew 500 ppm
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903673, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:53.180238 30760 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:53.181478 30760 webserver.cc:492] Webserver started at http://127.25.254.254:34949/ using document root <none> and password file <none>
I20260504 14:07:53.182099 30760 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:53.182206 30760 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:53.182423 30760 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:53.184223 30760 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "5aeac4a209385c95555db6c9445a3dae"
server_key_iv: "54aa0ab0d0128c0808c06174b4dc6b5d"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.184700 30760 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "5aeac4a209385c95555db6c9445a3dae"
server_key_iv: "54aa0ab0d0128c0808c06174b4dc6b5d"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.188408 30760 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.002s
I20260504 14:07:53.190840 30775 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.192029 30760 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:53.192179 30760 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "5aeac4a209385c95555db6c9445a3dae"
server_key_iv: "54aa0ab0d0128c0808c06174b4dc6b5d"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.192299 30760 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:53.209686 30760 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:53.212991 30760 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:53.213222 30760 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:53.221217 30760 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:36015
I20260504 14:07:53.221227 30827 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:36015 every 8 connection(s)
I20260504 14:07:53.222350 30760 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:53.223146 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30760
I20260504 14:07:53.223272 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:53.223526 26619 external_mini_cluster.cc:1468] Setting key 70c0ee88231276bf7f779ce36e701784
I20260504 14:07:53.226245 30828 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903672, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:53.232137 30828 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9: Bootstrap starting.
I20260504 14:07:53.234609 30828 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:53.235327 30828 log.cc:826] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:53.237036 30831 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:53.224870 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53972 (local address 127.25.254.254:36015)
0504 14:07:53.225268 (+   398us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:53.225278 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:53.225308 (+    30us) server_negotiation.cc:408] Connection header received
0504 14:07:53.225988 (+   680us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:53.226006 (+    18us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:53.226400 (+   394us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:53.226769 (+   369us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:53.227687 (+   918us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.228456 (+   769us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:53.229409 (+   953us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.229756 (+   347us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:53.232448 (+  2692us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:53.232511 (+    63us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:53.232523 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:53.232563 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:53.234397 (+  1834us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.234933 (+   536us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.234938 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.234943 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.235018 (+    75us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.235301 (+   283us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.235305 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.235307 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.235698 (+   391us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:53.235892 (+   194us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:53.236223 (+   331us) server_negotiation.cc:300] Negotiation successful
0504 14:07:53.236456 (+   233us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":272,"thread_start_us":128,"threads_started":1}
I20260504 14:07:53.237480 30828 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9: No bootstrap required, opened a new log
I20260504 14:07:53.240581 30828 raft_consensus.cc:359] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } }
I20260504 14:07:53.240880 30828 raft_consensus.cc:385] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:53.240979 30828 raft_consensus.cc:740] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 56b9e7b89e4e4c4e9f391073fb4d39c9, State: Initialized, Role: FOLLOWER
I20260504 14:07:53.241467 30828 consensus_queue.cc:260] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } }
I20260504 14:07:53.241611 30828 raft_consensus.cc:399] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:53.241660 30828 raft_consensus.cc:493] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:53.241732 30828 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:53.242686 30828 raft_consensus.cc:515] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } }
I20260504 14:07:53.242988 30828 leader_election.cc:304] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 56b9e7b89e4e4c4e9f391073fb4d39c9; no voters: 
I20260504 14:07:53.243250 30828 leader_election.cc:290] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:53.243389 30833 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:53.243736 30833 raft_consensus.cc:697] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [term 1 LEADER]: Becoming Leader. State: Replica: 56b9e7b89e4e4c4e9f391073fb4d39c9, State: Running, Role: LEADER
I20260504 14:07:53.244064 30833 consensus_queue.cc:237] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } }
I20260504 14:07:53.244415 30828 sys_catalog.cc:565] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:53.245707 30834 sys_catalog.cc:455] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } } }
I20260504 14:07:53.245834 30834 sys_catalog.cc:458] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:53.246213 30835 sys_catalog.cc:455] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 56b9e7b89e4e4c4e9f391073fb4d39c9. Latest consensus state: current_term: 1 leader_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "56b9e7b89e4e4c4e9f391073fb4d39c9" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 36015 } } }
I20260504 14:07:53.246313 30835 sys_catalog.cc:458] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:53.246251 30842 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:53.249037 30842 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:53.254555 30842 catalog_manager.cc:1357] Generated new cluster ID: a31cdaeed186481fb1f246a599f0678a
I20260504 14:07:53.254642 30842 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:53.286656 30842 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:53.287596 30842 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:53.294812 30842 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 56b9e7b89e4e4c4e9f391073fb4d39c9: Generated new TSK 0
I20260504 14:07:53.295598 30842 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:53.367244 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:36015
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:41291
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:53.475951 30856 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:53.476179 30856 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:53.476236 30856 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:53.479616 30856 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:53.479691 30856 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:53.479772 30856 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:53.484246 30856 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:41291
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:36015
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30856
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:53.485693 30856 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:53.486758 30856 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:53.493966 30862 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:53.494007 30856 server_base.cc:1061] running on GCE node
W20260504 14:07:53.493954 30861 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:53.493954 30864 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:53.494639 30856 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:53.495308 30856 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:53.496519 30856 hybrid_clock.cc:648] HybridClock initialized: now 1777903673496468 us; error 59 us; skew 500 ppm
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903673, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:53.499598 30856 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:53.500963 30856 webserver.cc:492] Webserver started at http://127.25.254.193:38537/ using document root <none> and password file <none>
I20260504 14:07:53.501533 30856 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:53.501583 30856 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:53.501762 30856 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:53.503715 30856 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "915e814c35ca4b06aad66e7633f3a0e8"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "0529f7be9aa8d39a4555e163eec2788c"
server_key_iv: "e8b4095aa45e53460c353323e234a21c"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.504184 30856 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "915e814c35ca4b06aad66e7633f3a0e8"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "0529f7be9aa8d39a4555e163eec2788c"
server_key_iv: "e8b4095aa45e53460c353323e234a21c"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.507879 30856 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:53.510310 30871 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.511399 30856 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:53.511564 30856 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "915e814c35ca4b06aad66e7633f3a0e8"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "0529f7be9aa8d39a4555e163eec2788c"
server_key_iv: "e8b4095aa45e53460c353323e234a21c"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.511662 30856 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:53.528091 30856 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:53.531188 30856 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:53.531363 30856 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:53.532015 30856 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:53.532959 30856 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:53.533010 30856 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.533077 30856 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:53.533107 30856 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.544442 30856 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:44793
I20260504 14:07:53.544466 30984 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:44793 every 8 connection(s)
I20260504 14:07:53.545575 30856 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:53.554412 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30856
I20260504 14:07:53.554564 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:53.554901 26619 external_mini_cluster.cc:1468] Setting key 2f03dd94b082f9b06f7fcb49c4e852a6
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903673, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:53.561414 30831 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:53.547939 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60609 (local address 127.25.254.254:36015)
0504 14:07:53.548192 (+   253us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:53.548197 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:53.548731 (+   534us) server_negotiation.cc:408] Connection header received
0504 14:07:53.549722 (+   991us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:53.549726 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:53.549782 (+    56us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:53.549871 (+    89us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:53.551941 (+  2070us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.552499 (+   558us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:53.553432 (+   933us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.553591 (+   159us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:53.556617 (+  3026us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:53.556643 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:53.556649 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:53.556684 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:53.558648 (+  1964us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.559275 (+   627us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.559282 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.559285 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.559349 (+    64us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.559942 (+   593us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.559950 (+     8us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.559955 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.560186 (+   231us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:53.560312 (+   126us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:53.560980 (+   668us) server_negotiation.cc:300] Negotiation successful
0504 14:07:53.561150 (+   170us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":131}
I20260504 14:07:53.562076 30987 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:53.548033 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:36015 (local address 127.25.254.193:60609)
0504 14:07:53.548523 (+   490us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:53.548598 (+    75us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:53.549457 (+   859us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:53.550335 (+   878us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:53.550345 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:53.550811 (+   466us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:53.551711 (+   900us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:53.551726 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.552765 (+  1039us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:53.552769 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:53.553253 (+   484us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:53.553261 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.553490 (+   229us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:53.554266 (+   776us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:53.554301 (+    35us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:53.556380 (+  2079us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:53.558855 (+  2475us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:53.558862 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:53.558873 (+    11us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:53.559136 (+   263us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:53.559567 (+   431us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:53.559570 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:53.559572 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:53.559694 (+   122us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:53.560338 (+   644us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:53.560350 (+    12us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:53.560681 (+   331us) client_negotiation.cc:770] Sending connection context
0504 14:07:53.560881 (+   200us) client_negotiation.cc:241] Negotiation successful
0504 14:07:53.561104 (+   223us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":296,"thread_start_us":111,"threads_started":1}
I20260504 14:07:53.563525 30985 heartbeater.cc:344] Connected to a master server at 127.25.254.254:36015
I20260504 14:07:53.563843 30985 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:53.564572 30985 heartbeater.cc:507] Master 127.25.254.254:36015 requested a full tablet report, sending...
I20260504 14:07:53.566527 30792 ts_manager.cc:194] Registered new tserver with Master: 915e814c35ca4b06aad66e7633f3a0e8 (127.25.254.193:44793)
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:53.567790 30792 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:60609
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:53.627030 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:36015
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:41291
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:53.740347 30992 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:53.740612 30992 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:53.740674 30992 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:53.744407 30992 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:53.744491 30992 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:53.744577 30992 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:53.749923 30992 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:41291
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:36015
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.30992
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:53.751360 30992 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:53.752277 30992 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:53.759745 31000 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:53.759764 30998 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:53.759757 30997 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:53.759831 30992 server_base.cc:1061] running on GCE node
I20260504 14:07:53.760481 30992 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:53.761121 30992 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:53.762306 30992 hybrid_clock.cc:648] HybridClock initialized: now 1777903673762293 us; error 28 us; skew 500 ppm
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903673, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:53.765101 30992 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:53.766256 30992 webserver.cc:492] Webserver started at http://127.25.254.194:44741/ using document root <none> and password file <none>
I20260504 14:07:53.766894 30992 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:53.766947 30992 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:53.767161 30992 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:53.769407 30992 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "c4737909287e47439efae27fd4865a57"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "7d6d5d0e6758e786ee8631079dec3d2b"
server_key_iv: "e0b1cf59e9471b0ffe49ed93c3b9aeef"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.770469 30992 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "c4737909287e47439efae27fd4865a57"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "7d6d5d0e6758e786ee8631079dec3d2b"
server_key_iv: "e0b1cf59e9471b0ffe49ed93c3b9aeef"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.774793 30992 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.003s	sys 0.000s
I20260504 14:07:53.778050 31007 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.779754 30992 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.004s	sys 0.000s
I20260504 14:07:53.779899 30992 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "c4737909287e47439efae27fd4865a57"
format_stamp: "Formatted at 2026-05-04 14:07:53 on dist-test-slave-2x32"
server_key: "7d6d5d0e6758e786ee8631079dec3d2b"
server_key_iv: "e0b1cf59e9471b0ffe49ed93c3b9aeef"
server_key_version: "encryptionkey@0"
I20260504 14:07:53.780076 30992 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:53.796262 30992 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:53.800006 30992 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:53.800192 30992 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:53.800751 30992 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:53.801731 30992 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:53.801812 30992 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.801920 30992 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:53.801963 30992 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:53.812291 30992 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:34787
I20260504 14:07:53.812343 31120 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:34787 every 8 connection(s)
I20260504 14:07:53.813378 30992 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:53.814204 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 30992
I20260504 14:07:53.814335 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:53.814597 26619 external_mini_cluster.cc:1468] Setting key 574777244d72cdacc4ac1b2db7c61701
May 04 14:07:53 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903673, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:07:53.829671 30831 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:53.815579 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:47431 (local address 127.25.254.254:36015)
0504 14:07:53.815717 (+   138us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:53.815721 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:53.817016 (+  1295us) server_negotiation.cc:408] Connection header received
0504 14:07:53.817903 (+   887us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:53.817911 (+     8us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:53.817989 (+    78us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:53.818211 (+   222us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:53.820063 (+  1852us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.820592 (+   529us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:53.821291 (+   699us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.821469 (+   178us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:53.824524 (+  3055us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:53.824550 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:53.824552 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:53.824587 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:53.826727 (+  2140us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.827438 (+   711us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.827442 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.827443 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.827516 (+    73us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:53.828028 (+   512us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:53.828035 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:53.828040 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:53.828275 (+   235us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:53.828401 (+   126us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:53.829397 (+   996us) server_negotiation.cc:300] Negotiation successful
0504 14:07:53.829524 (+   127us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":46}
I20260504 14:07:53.830613 31123 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:53.816033 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:36015 (local address 127.25.254.194:47431)
0504 14:07:53.816850 (+   817us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:53.816892 (+    42us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:53.817684 (+   792us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:53.818394 (+   710us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:53.818406 (+    12us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:53.818993 (+   587us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:53.819887 (+   894us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:53.819900 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.820732 (+   832us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:53.820742 (+    10us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:53.821154 (+   412us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:53.821161 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:53.821397 (+   236us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:53.822046 (+   649us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:53.822066 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:53.824292 (+  2226us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:53.826934 (+  2642us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:53.826941 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:53.826953 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:53.827244 (+   291us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:53.827696 (+   452us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:53.827700 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:53.827701 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:53.827890 (+   189us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:53.828559 (+   669us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:53.828567 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:53.829020 (+   453us) client_negotiation.cc:770] Sending connection context
0504 14:07:53.829304 (+   284us) client_negotiation.cc:241] Negotiation successful
0504 14:07:53.829618 (+   314us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":567,"thread_start_us":280,"threads_started":1}
I20260504 14:07:53.831988 31121 heartbeater.cc:344] Connected to a master server at 127.25.254.254:36015
I20260504 14:07:53.832407 31121 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:53.833115 31121 heartbeater.cc:507] Master 127.25.254.254:36015 requested a full tablet report, sending...
I20260504 14:07:53.834446 30792 ts_manager.cc:194] Registered new tserver with Master: c4737909287e47439efae27fd4865a57 (127.25.254.194:34787)
I20260504 14:07:53.835116 30792 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:47431
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:53.881604 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:36015
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:41291
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:53.991698 31128 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:53.991928 31128 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:53.992024 31128 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:53.995630 31128 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:53.995712 31128 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:53.995843 31128 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:54.000366 31128 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:41291
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:36015
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31128
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:54.001514 31128 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:54.002449 31128 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:54.009850 31136 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:54.009922 31134 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:54.009850 31133 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:54.010180 31128 server_base.cc:1061] running on GCE node
I20260504 14:07:54.010675 31128 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:54.011265 31128 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:54.012475 31128 hybrid_clock.cc:648] HybridClock initialized: now 1777903674012438 us; error 50 us; skew 500 ppm
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903674, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:54.015654 31128 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:54.016932 31128 webserver.cc:492] Webserver started at http://127.25.254.195:40145/ using document root <none> and password file <none>
I20260504 14:07:54.017547 31128 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:54.017623 31128 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:54.017846 31128 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:54.019836 31128 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "b7866727efb04458bb489c4b4cd7cc5d"
format_stamp: "Formatted at 2026-05-04 14:07:54 on dist-test-slave-2x32"
server_key: "bc78fe0326336d4415d8d64f501a2386"
server_key_iv: "a2a575a392991cead1515e2bcdf17288"
server_key_version: "encryptionkey@0"
I20260504 14:07:54.020366 31128 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "b7866727efb04458bb489c4b4cd7cc5d"
format_stamp: "Formatted at 2026-05-04 14:07:54 on dist-test-slave-2x32"
server_key: "bc78fe0326336d4415d8d64f501a2386"
server_key_iv: "a2a575a392991cead1515e2bcdf17288"
server_key_version: "encryptionkey@0"
I20260504 14:07:54.024048 31128 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:07:54.026623 31143 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:54.027801 31128 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:07:54.027963 31128 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "b7866727efb04458bb489c4b4cd7cc5d"
format_stamp: "Formatted at 2026-05-04 14:07:54 on dist-test-slave-2x32"
server_key: "bc78fe0326336d4415d8d64f501a2386"
server_key_iv: "a2a575a392991cead1515e2bcdf17288"
server_key_version: "encryptionkey@0"
I20260504 14:07:54.028091 31128 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:54.042538 31128 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:54.045364 31128 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:54.045571 31128 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:54.046231 31128 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:54.047268 31128 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:54.047338 31128 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:54.047402 31128 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:54.047454 31128 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:54.057431 31128 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:45711
I20260504 14:07:54.057451 31256 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:45711 every 8 connection(s)
I20260504 14:07:54.058470 31128 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903674, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:54.068372 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31128
I20260504 14:07:54.068512 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestAuthorizedSuperuser.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:54.068825 26619 external_mini_cluster.cc:1468] Setting key 9652d4290c19476e3ff2fc657a3009ac
I20260504 14:07:54.071719 30831 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:54.060350 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:57223 (local address 127.25.254.254:36015)
0504 14:07:54.060493 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:54.060498 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:54.061166 (+   668us) server_negotiation.cc:408] Connection header received
0504 14:07:54.061971 (+   805us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:54.061974 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:54.062022 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:54.062103 (+    81us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:54.063833 (+  1730us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.064333 (+   500us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:54.065087 (+   754us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.065226 (+   139us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:54.067446 (+  2220us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:54.067466 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:54.067469 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:54.067496 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:54.069373 (+  1877us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.070088 (+   715us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.070094 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.070097 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.070154 (+    57us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.070609 (+   455us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.070613 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.070615 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.070803 (+   188us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:54.070904 (+   101us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:54.071475 (+   571us) server_negotiation.cc:300] Negotiation successful
0504 14:07:54.071600 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
I20260504 14:07:54.072381 31259 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:54.060610 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:36015 (local address 127.25.254.195:57223)
0504 14:07:54.061023 (+   413us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:54.061059 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:54.061786 (+   727us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:54.062371 (+   585us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:54.062380 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:54.062762 (+   382us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:54.063635 (+   873us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:54.063650 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.064489 (+   839us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:54.064493 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:54.064966 (+   473us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:54.064974 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.065139 (+   165us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:54.065722 (+   583us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:54.065741 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:54.067293 (+  1552us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:54.069502 (+  2209us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:54.069509 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:54.069523 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:54.069951 (+   428us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:54.070289 (+   338us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:54.070293 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:54.070295 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:54.070437 (+   142us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:54.070920 (+   483us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:54.070925 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:54.071165 (+   240us) client_negotiation.cc:770] Sending connection context
0504 14:07:54.071433 (+   268us) client_negotiation.cc:241] Negotiation successful
0504 14:07:54.071659 (+   226us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":238,"thread_start_us":94,"threads_started":1}
I20260504 14:07:54.073495 31257 heartbeater.cc:344] Connected to a master server at 127.25.254.254:36015
I20260504 14:07:54.073776 31257 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:54.074410 31257 heartbeater.cc:507] Master 127.25.254.254:36015 requested a full tablet report, sending...
I20260504 14:07:54.075435 30792 ts_manager.cc:194] Registered new tserver with Master: b7866727efb04458bb489c4b4cd7cc5d (127.25.254.195:45711)
I20260504 14:07:54.076004 30792 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:57223
I20260504 14:07:54.083452 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[30744](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903674, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903674, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:07:54.109587 31264 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:54.099883 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:41050 (local address 127.25.254.193:44793)
0504 14:07:54.100180 (+   297us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:54.100187 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:54.100227 (+    40us) server_negotiation.cc:408] Connection header received
0504 14:07:54.100284 (+    57us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:54.100288 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:54.100452 (+   164us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:54.100574 (+   122us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:54.101414 (+   840us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.102298 (+   884us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:54.103001 (+   703us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.103249 (+   248us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:54.105933 (+  2684us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:54.105960 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:54.105971 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:54.106000 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:54.107926 (+  1926us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.108377 (+   451us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.108382 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.108383 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.108447 (+    64us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.108723 (+   276us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.108725 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.108727 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.108909 (+   182us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:54.109056 (+   147us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:54.109292 (+   236us) server_negotiation.cc:300] Negotiation successful
0504 14:07:54.109421 (+   129us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":181,"thread_start_us":128,"threads_started":1}
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[30744](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903674, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:54.122531 30831 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:54.112610 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53988 (local address 127.25.254.254:36015)
0504 14:07:54.112767 (+   157us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:54.112771 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:54.112946 (+   175us) server_negotiation.cc:408] Connection header received
0504 14:07:54.112995 (+    49us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:54.112998 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:54.113049 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:54.113140 (+    91us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:54.113940 (+   800us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.114529 (+   589us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:54.115230 (+   701us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:54.115372 (+   142us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:54.117619 (+  2247us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:54.117639 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:54.117641 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:54.117670 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:54.120888 (+  3218us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.121355 (+   467us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.121357 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.121359 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.121408 (+    49us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:54.121676 (+   268us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:54.121680 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:54.121681 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:54.121877 (+   196us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:54.121999 (+   122us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:54.122270 (+   271us) server_negotiation.cc:300] Negotiation successful
0504 14:07:54.122393 (+   123us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
W20260504 14:07:54.123010 30792 server_base.cc:1143] Unauthorized access attempt to method kudu.master.MasterService.TSHeartbeat from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:53988
I20260504 14:07:54.123767 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30856
I20260504 14:07:54.130369 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30992
I20260504 14:07:54.136534 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31128
I20260504 14:07:54.142352 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 30760
2026-05-04T14:07:54Z chronyd exiting
[       OK ] SecurityITest.TestAuthorizedSuperuser (3310 ms)
[ RUN      ] SecurityITest.TestDisableWebUI
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[31269](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[31269](info): set up 2 sockets
May 04 14:07:54 dist-test-slave-2x32 krb5kdc[31269](info): commencing operation
krb5kdc: starting...
W20260504 14:07:56.222635 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.058s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:07:56 dist-test-slave-2x32 krb5kdc[31269](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903676, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:07:56Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:56Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:56.374118 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:38955
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38875
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:38955
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--webserver_enabled=0 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:56.482677 31285 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:56.482930 31285 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:56.483033 31285 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:56.486558 31285 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:56.486634 31285 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:56.486660 31285 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:56.486680 31285 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:56.486735 31285 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:56.491437 31285 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38875
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:38955
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:38955
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--webserver_enabled=false
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31285
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:56.492626 31285 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:56.493533 31285 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:56.499403 31290 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:56.499396 31291 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:56.499396 31293 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:56.499747 31285 server_base.cc:1061] running on GCE node
I20260504 14:07:56.500288 31285 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:56.501279 31285 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:56.502458 31285 hybrid_clock.cc:648] HybridClock initialized: now 1777903676502458 us; error 65 us; skew 500 ppm
May 04 14:07:56 dist-test-slave-2x32 krb5kdc[31269](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903676, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:56.505182 31285 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:07:56.505956 31285 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:56.506019 31285 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:56.506268 31285 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:56.508047 31285 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "f0520c9b48f64804bfdda570ef59f3ec"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "b2505dea1048f3c14506debf98c7ec3b"
server_key_iv: "f6087ddbed50c9e8dd621e9d509556fc"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.508496 31285 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "f0520c9b48f64804bfdda570ef59f3ec"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "b2505dea1048f3c14506debf98c7ec3b"
server_key_iv: "f6087ddbed50c9e8dd621e9d509556fc"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.512111 31285 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:07:56.514451 31299 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:56.515568 31285 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:56.515755 31285 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "f0520c9b48f64804bfdda570ef59f3ec"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "b2505dea1048f3c14506debf98c7ec3b"
server_key_iv: "f6087ddbed50c9e8dd621e9d509556fc"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.515862 31285 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:56.545367 31285 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:56.548944 31285 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:56.549190 31285 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:56.559904 31285 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:38955
I20260504 14:07:56.559892 31351 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:38955 every 8 connection(s)
I20260504 14:07:56.560720 31285 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:56.564307 31352 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:56.569831 31352 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: Bootstrap starting.
I20260504 14:07:56.571009 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31285
I20260504 14:07:56.571153 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:56.571434 26619 external_mini_cluster.cc:1468] Setting key 987a77c03a62d9eb6f2cf495b2edc611
I20260504 14:07:56.572865 31352 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:56.573613 31352 log.cc:826] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:56.575868 31352 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: No bootstrap required, opened a new log
I20260504 14:07:56.578933 31352 raft_consensus.cc:359] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } }
I20260504 14:07:56.579203 31352 raft_consensus.cc:385] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:56.579269 31352 raft_consensus.cc:740] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f0520c9b48f64804bfdda570ef59f3ec, State: Initialized, Role: FOLLOWER
I20260504 14:07:56.579797 31352 consensus_queue.cc:260] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } }
I20260504 14:07:56.579975 31352 raft_consensus.cc:399] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:56.580057 31352 raft_consensus.cc:493] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:56.580183 31352 raft_consensus.cc:3060] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 0 FOLLOWER]: Advancing to term 1
May 04 14:07:56 dist-test-slave-2x32 krb5kdc[31269](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903676, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:56.581211 31352 raft_consensus.cc:515] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } }
I20260504 14:07:56.581630 31352 leader_election.cc:304] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: f0520c9b48f64804bfdda570ef59f3ec; no voters: 
I20260504 14:07:56.582198 31352 leader_election.cc:290] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:56.582299 31357 raft_consensus.cc:2804] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:56.582517 31357 raft_consensus.cc:697] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [term 1 LEADER]: Becoming Leader. State: Replica: f0520c9b48f64804bfdda570ef59f3ec, State: Running, Role: LEADER
I20260504 14:07:56.582955 31357 consensus_queue.cc:237] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } }
I20260504 14:07:56.583606 31352 sys_catalog.cc:565] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:56.584615 31358 sys_catalog.cc:455] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "f0520c9b48f64804bfdda570ef59f3ec" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } } }
I20260504 14:07:56.584705 31359 sys_catalog.cc:455] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [sys.catalog]: SysCatalogTable state changed. Reason: New leader f0520c9b48f64804bfdda570ef59f3ec. Latest consensus state: current_term: 1 leader_uuid: "f0520c9b48f64804bfdda570ef59f3ec" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f0520c9b48f64804bfdda570ef59f3ec" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38955 } } }
I20260504 14:07:56.584830 31359 sys_catalog.cc:458] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:56.584765 31358 sys_catalog.cc:458] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:56.585286 31366 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:56.587822 31355 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:56.572614 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:44328 (local address 127.25.254.254:38955)
0504 14:07:56.573221 (+   607us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:56.573232 (+    11us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:56.573271 (+    39us) server_negotiation.cc:408] Connection header received
0504 14:07:56.573963 (+   692us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:56.574005 (+    42us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:56.574419 (+   414us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:56.574828 (+   409us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:56.575892 (+  1064us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.576751 (+   859us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:56.577444 (+   693us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.577772 (+   328us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:56.581169 (+  3397us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:56.581203 (+    34us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:56.581217 (+    14us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:56.581252 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:56.584430 (+  3178us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:56.585039 (+   609us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:56.585046 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:56.585054 (+     8us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:56.585168 (+   114us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:56.585484 (+   316us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:56.585488 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:56.585490 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:56.585965 (+   475us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:56.586193 (+   228us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:56.586691 (+   498us) server_negotiation.cc:300] Negotiation successful
0504 14:07:56.587029 (+   338us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":408,"thread_start_us":144,"threads_started":1}
I20260504 14:07:56.588868 31366 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:56.594496 31366 catalog_manager.cc:1357] Generated new cluster ID: 6614d3912c2b454e9fa9fe09689b3771
I20260504 14:07:56.594588 31366 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:56.649899 31366 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:56.650887 31366 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:56.660362 31366 catalog_manager.cc:6044] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: Generated new TSK 0
I20260504 14:07:56.661072 31366 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:56.726094 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38955
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38875
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--webserver_enabled=0 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:56.835815 31380 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:56.836071 31380 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:56.836136 31380 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:56.839848 31380 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:56.839938 31380 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:56.840066 31380 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:56.844877 31380 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38875
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--webserver_enabled=false
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38955
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31380
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:56.846081 31380 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:56.847025 31380 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:56.854008 31388 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:56.854005 31386 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:56.854005 31385 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:56.854449 31380 server_base.cc:1061] running on GCE node
I20260504 14:07:56.855015 31380 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:56.855579 31380 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:56.856755 31380 hybrid_clock.cc:648] HybridClock initialized: now 1777903676856725 us; error 46 us; skew 500 ppm
May 04 14:07:56 dist-test-slave-2x32 krb5kdc[31269](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903676, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:56.859968 31380 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:07:56.860955 31380 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:56.861032 31380 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:56.861307 31380 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:56.863334 31380 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "6b71a80afb2d4b50b13e3dc6b935099c"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "91810fcec597853fc95bbdc082c76aee"
server_key_iv: "5f25e1fdd99407d48727c11a42a5bfe8"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.863950 31380 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "6b71a80afb2d4b50b13e3dc6b935099c"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "91810fcec597853fc95bbdc082c76aee"
server_key_iv: "5f25e1fdd99407d48727c11a42a5bfe8"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.868117 31380 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.001s	sys 0.004s
I20260504 14:07:56.870748 31394 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:56.872027 31380 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:56.872148 31380 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "6b71a80afb2d4b50b13e3dc6b935099c"
format_stamp: "Formatted at 2026-05-04 14:07:56 on dist-test-slave-2x32"
server_key: "91810fcec597853fc95bbdc082c76aee"
server_key_iv: "5f25e1fdd99407d48727c11a42a5bfe8"
server_key_version: "encryptionkey@0"
I20260504 14:07:56.872309 31380 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:56.897612 31380 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:56.900533 31380 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:56.900763 31380 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:56.901423 31380 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:56.902521 31380 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:56.902586 31380 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:56.902647 31380 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:56.902686 31380 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:56.912953 31380 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:42145
I20260504 14:07:56.913064 31507 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:42145 every 8 connection(s)
I20260504 14:07:56.913501 31380 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
May 04 14:07:56 dist-test-slave-2x32 krb5kdc[31269](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903676, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:56.923048 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31380
I20260504 14:07:56.923219 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:56.923532 26619 external_mini_cluster.cc:1468] Setting key bbab25e4efbdaf15e37197eaa8ed40c4
I20260504 14:07:56.926932 31355 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:56.915260 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60959 (local address 127.25.254.254:38955)
0504 14:07:56.915419 (+   159us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:56.915423 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:56.916096 (+   673us) server_negotiation.cc:408] Connection header received
0504 14:07:56.917475 (+  1379us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:56.917480 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:56.917542 (+    62us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:56.917690 (+   148us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:56.918955 (+  1265us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.919463 (+   508us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:56.920143 (+   680us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.920327 (+   184us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:56.922863 (+  2536us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:56.922885 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:56.922887 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:56.922915 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:56.924671 (+  1756us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:56.925267 (+   596us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:56.925270 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:56.925272 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:56.925318 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:56.925716 (+   398us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:56.925719 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:56.925720 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:56.925877 (+   157us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:56.925971 (+    94us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:56.926656 (+   685us) server_negotiation.cc:300] Negotiation successful
0504 14:07:56.926769 (+   113us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":61}
I20260504 14:07:56.927690 31510 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:56.915528 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38955 (local address 127.25.254.193:60959)
0504 14:07:56.915954 (+   426us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:56.915987 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:56.917256 (+  1269us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:56.917861 (+   605us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:56.917869 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:56.918330 (+   461us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:56.918788 (+   458us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:56.918799 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.919593 (+   794us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:56.919597 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:56.920008 (+   411us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:56.920015 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:56.920307 (+   292us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:56.920931 (+   624us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:56.920952 (+    21us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:56.922608 (+  1656us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:56.924836 (+  2228us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:56.924843 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:56.924858 (+    15us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:56.925161 (+   303us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:56.925414 (+   253us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:56.925418 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:56.925420 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:56.925585 (+   165us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:56.926003 (+   418us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:56.926013 (+    10us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:56.926347 (+   334us) client_negotiation.cc:770] Sending connection context
0504 14:07:56.926612 (+   265us) client_negotiation.cc:241] Negotiation successful
0504 14:07:56.926900 (+   288us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":252,"thread_start_us":111,"threads_started":1}
I20260504 14:07:56.929128 31508 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38955
I20260504 14:07:56.929422 31508 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:56.930003 31508 heartbeater.cc:507] Master 127.25.254.254:38955 requested a full tablet report, sending...
I20260504 14:07:56.931655 31314 ts_manager.cc:194] Registered new tserver with Master: 6b71a80afb2d4b50b13e3dc6b935099c (127.25.254.193:42145)
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:07:56.932981 31314 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:60959
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:56.987989 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38955
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38875
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--webserver_enabled=0 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:57.100315 31515 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:57.100931 31515 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:57.101051 31515 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:57.104709 31515 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:57.104816 31515 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:57.104950 31515 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:57.109503 31515 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38875
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--webserver_enabled=false
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38955
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31515
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:57.110778 31515 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:57.111680 31515 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:57.118403 31520 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:57.118389 31523 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:57.118387 31521 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:57.118651 31515 server_base.cc:1061] running on GCE node
I20260504 14:07:57.119104 31515 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:57.119750 31515 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:57.120942 31515 hybrid_clock.cc:648] HybridClock initialized: now 1777903677120918 us; error 37 us; skew 500 ppm
May 04 14:07:57 dist-test-slave-2x32 krb5kdc[31269](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903677, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:57.123919 31515 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:07:57.124722 31515 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:57.124807 31515 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:57.125032 31515 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:57.126853 31515 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "d38b4c7dfabb42bb82ec719169d3444c"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "3cdc1b949c55059d4796ae0649aaac66"
server_key_iv: "441f496a8d1ae09e94270d566720774b"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.127343 31515 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "d38b4c7dfabb42bb82ec719169d3444c"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "3cdc1b949c55059d4796ae0649aaac66"
server_key_iv: "441f496a8d1ae09e94270d566720774b"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.130878 31515 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:07:57.133611 31529 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.134757 31515 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:57.134903 31515 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "d38b4c7dfabb42bb82ec719169d3444c"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "3cdc1b949c55059d4796ae0649aaac66"
server_key_iv: "441f496a8d1ae09e94270d566720774b"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.135020 31515 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:57.144778 31515 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:57.147727 31515 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:57.147948 31515 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:57.148536 31515 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:57.149428 31515 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:57.149503 31515 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.149601 31515 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:57.149652 31515 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.160688 31515 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:44393
I20260504 14:07:57.160701 31642 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:44393 every 8 connection(s)
I20260504 14:07:57.161284 31515 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:57.164705 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31515
I20260504 14:07:57.164927 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:57.165202 26619 external_mini_cluster.cc:1468] Setting key 16f631beb67f2fb76dbc842c6380864c
May 04 14:07:57 dist-test-slave-2x32 krb5kdc[31269](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903677, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:57.177062 31355 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.163332 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:33941 (local address 127.25.254.254:38955)
0504 14:07:57.163477 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.163481 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.164075 (+   594us) server_negotiation.cc:408] Connection header received
0504 14:07:57.165874 (+  1799us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.165878 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.165941 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.166018 (+    77us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:57.167741 (+  1723us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.168534 (+   793us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.169425 (+   891us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.169607 (+   182us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.173210 (+  3603us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:57.173234 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:57.173237 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:57.173269 (+    32us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:57.175017 (+  1748us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.175563 (+   546us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.175567 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.175569 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.175624 (+    55us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.175945 (+   321us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.175950 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.175952 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.176111 (+   159us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:57.176224 (+   113us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.176734 (+   510us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.176881 (+   147us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":45}
I20260504 14:07:57.177832 31645 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.163518 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38955 (local address 127.25.254.194:33941)
0504 14:07:57.163932 (+   414us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.163966 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.165618 (+  1652us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.166324 (+   706us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.166335 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.166928 (+   593us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:57.167575 (+   647us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.167589 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.168689 (+  1100us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.168692 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.169294 (+   602us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.169303 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.169832 (+   529us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.170875 (+  1043us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:57.170900 (+    25us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:57.173003 (+  2103us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:57.175168 (+  2165us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:57.175174 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:57.175187 (+    13us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:57.175443 (+   256us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:57.175722 (+   279us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:57.175726 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:57.175730 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:57.175846 (+   116us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:57.176223 (+   377us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:57.176231 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:57.176469 (+   238us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.176686 (+   217us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.176928 (+   242us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":248,"thread_start_us":104,"threads_started":1}
I20260504 14:07:57.179275 31643 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38955
I20260504 14:07:57.179580 31643 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:57.180087 31643 heartbeater.cc:507] Master 127.25.254.254:38955 requested a full tablet report, sending...
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:07:57.181257 31314 ts_manager.cc:194] Registered new tserver with Master: d38b4c7dfabb42bb82ec719169d3444c (127.25.254.194:44393)
I20260504 14:07:57.181905 31314 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:33941
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:07:57.231753 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38955
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:38875
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--webserver_enabled=0 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:07:57.343616 31650 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:57.343866 31650 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:57.343930 31650 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:57.347592 31650 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:57.347682 31650 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:57.347769 31650 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:57.352557 31650 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38875
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--webserver_enabled=false
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38955
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31650
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:57.353727 31650 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:57.354580 31650 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:57.361096 31658 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:57.361099 31656 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:57.361128 31655 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:57.361585 31650 server_base.cc:1061] running on GCE node
I20260504 14:07:57.362025 31650 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:57.362731 31650 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:57.363940 31650 hybrid_clock.cc:648] HybridClock initialized: now 1777903677363926 us; error 42 us; skew 500 ppm
May 04 14:07:57 dist-test-slave-2x32 krb5kdc[31269](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903677, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:07:57.367100 31650 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:07:57.367926 31650 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:57.367990 31650 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:57.368216 31650 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:57.370101 31650 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "bbc0169188e74d67be47fb9bcfc349ba"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "d30ee21ef1315fe1d78b299d08214d1c"
server_key_iv: "c05994803d2ef08312e35919cf66e8b4"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.370702 31650 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "bbc0169188e74d67be47fb9bcfc349ba"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "d30ee21ef1315fe1d78b299d08214d1c"
server_key_iv: "c05994803d2ef08312e35919cf66e8b4"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.374126 31650 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:07:57.376533 31664 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.377758 31650 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:57.378013 31650 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "bbc0169188e74d67be47fb9bcfc349ba"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "d30ee21ef1315fe1d78b299d08214d1c"
server_key_iv: "c05994803d2ef08312e35919cf66e8b4"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.378134 31650 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:57.405035 31650 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:57.407932 31650 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:57.408159 31650 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:57.408767 31650 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:57.409710 31650 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:57.409793 31650 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.409866 31650 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:57.409904 31650 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.420207 31650 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:42309
I20260504 14:07:57.420219 31777 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:42309 every 8 connection(s)
I20260504 14:07:57.420959 31650 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
May 04 14:07:57 dist-test-slave-2x32 krb5kdc[31269](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903677, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:07:57.430866 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31650
I20260504 14:07:57.431021 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:57.431357 26619 external_mini_cluster.cc:1468] Setting key f924c834db1b75cbfda103b7220b6736
I20260504 14:07:57.434760 31355 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.422737 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:59399 (local address 127.25.254.254:38955)
0504 14:07:57.422918 (+   181us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.422922 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.423599 (+   677us) server_negotiation.cc:408] Connection header received
0504 14:07:57.424931 (+  1332us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.424935 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.424987 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.425072 (+    85us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:57.426507 (+  1435us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.427037 (+   530us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.427788 (+   751us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.427936 (+   148us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.430460 (+  2524us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:57.430482 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:57.430484 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:57.430511 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:57.432381 (+  1870us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.433095 (+   714us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.433099 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.433101 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.433155 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.433497 (+   342us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.433501 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.433503 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.433787 (+   284us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:57.433898 (+   111us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.434492 (+   594us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.434635 (+   143us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":85}
I20260504 14:07:57.435370 31780 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.422999 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38955 (local address 127.25.254.195:59399)
0504 14:07:57.423436 (+   437us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.423481 (+    45us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.424700 (+  1219us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.425280 (+   580us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.425289 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.425776 (+   487us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:57.426338 (+   562us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.426351 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.427206 (+   855us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.427210 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.427621 (+   411us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.427627 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.427810 (+   183us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.428460 (+   650us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:07:57.428479 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:57.430294 (+  1815us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:57.432522 (+  2228us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:57.432530 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:57.432561 (+    31us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:57.432952 (+   391us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:57.433276 (+   324us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:07:57.433279 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:07:57.433281 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:07:57.433394 (+   113us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:07:57.433945 (+   551us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:57.433951 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:57.434255 (+   304us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.434439 (+   184us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.434638 (+   199us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":267,"thread_start_us":110,"threads_started":1}
I20260504 14:07:57.436436 31778 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38955
I20260504 14:07:57.436699 31778 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:57.437206 31778 heartbeater.cc:507] Master 127.25.254.254:38955 requested a full tablet report, sending...
I20260504 14:07:57.438266 31314 ts_manager.cc:194] Registered new tserver with Master: bbc0169188e74d67be47fb9bcfc349ba (127.25.254.195:42309)
I20260504 14:07:57.438866 31314 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:59399
I20260504 14:07:57.446478 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:07:57.456401 31355 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.448805 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:44344 (local address 127.25.254.254:38955)
0504 14:07:57.448957 (+   152us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.448961 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.449123 (+   162us) server_negotiation.cc:408] Connection header received
0504 14:07:57.449200 (+    77us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.449203 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.449253 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.449331 (+    78us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:57.450322 (+   991us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.450838 (+   516us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.451702 (+   864us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.451857 (+   155us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.453109 (+  1252us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:57.453129 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:57.453131 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:07:57.453160 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:57.454650 (+  1490us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.455148 (+   498us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.455152 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.455153 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.455206 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:07:57.455483 (+   277us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:07:57.455486 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:07:57.455488 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:07:57.455737 (+   249us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:57.455890 (+   153us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.456164 (+   274us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.456270 (+   106us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:07:57.459954 31314 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:44344:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:57.462528 31314 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:57.477731 31789 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.472622 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42145 (local address 127.0.0.1:33784)
0504 14:07:57.473385 (+   763us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.473425 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.473593 (+   168us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.474444 (+   851us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.474450 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.474475 (+    25us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:57.474865 (+   390us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.474880 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.476157 (+  1277us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.476161 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.477102 (+   941us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.477111 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.477234 (+   123us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.477306 (+    72us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.477396 (+    90us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.477486 (+    90us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":581,"thread_start_us":109,"threads_started":1}
I20260504 14:07:57.478065 31792 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.472781 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:42309 (local address 127.0.0.1:53550)
0504 14:07:57.473650 (+   869us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.473666 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.473759 (+    93us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.474669 (+   910us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.474672 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.474693 (+    21us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:57.474934 (+   241us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.474945 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.476628 (+  1683us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.476633 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.477478 (+   845us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.477485 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.477604 (+   119us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.477618 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.477666 (+    48us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.477714 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":789,"thread_start_us":115,"threads_started":1}
I20260504 14:07:57.478703 31793 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.473412 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53550 (local address 127.25.254.195:42309)
0504 14:07:57.474237 (+   825us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.474242 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.474263 (+    21us) server_negotiation.cc:408] Connection header received
0504 14:07:57.474340 (+    77us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.474344 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.474508 (+   164us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.474667 (+   159us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:57.475074 (+   407us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.476475 (+  1401us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.477641 (+  1166us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.478265 (+   624us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.478398 (+   133us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.478467 (+    69us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.478556 (+    89us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":714,"thread_start_us":139,"threads_started":1}
I20260504 14:07:57.479069 31791 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.473369 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44393 (local address 127.0.0.1:47710)
0504 14:07:57.474362 (+   993us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.474378 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.474456 (+    78us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.474999 (+   543us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.475003 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.475027 (+    24us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:57.475267 (+   240us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.475274 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.477154 (+  1880us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.477160 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.478659 (+  1499us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.478670 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.478809 (+   139us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.478824 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.478880 (+    56us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.478944 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":898,"thread_start_us":118,"threads_started":1}
I20260504 14:07:57.480014 31790 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.472788 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:33784 (local address 127.25.254.193:42145)
0504 14:07:57.473895 (+  1107us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.473903 (+     8us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.473925 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:07:57.474008 (+    83us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.474016 (+     8us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.474259 (+   243us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.474433 (+   174us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:57.475065 (+   632us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.475989 (+   924us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.477641 (+  1652us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.479400 (+  1759us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.479490 (+    90us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.479571 (+    81us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.479685 (+   114us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":989,"thread_start_us":58,"threads_started":1}
I20260504 14:07:57.480703 31794 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.473790 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:47710 (local address 127.25.254.194:44393)
0504 14:07:57.474121 (+   331us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.474129 (+     8us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.474544 (+   415us) server_negotiation.cc:408] Connection header received
0504 14:07:57.474667 (+   123us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.474672 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.474829 (+   157us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.474974 (+   145us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:57.475427 (+   453us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.477010 (+  1583us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.479696 (+  2686us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.480278 (+   582us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.480403 (+   125us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.480475 (+    72us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.480563 (+    88us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":233,"thread_start_us":143,"threads_started":1}
I20260504 14:07:57.482473 31712 tablet_service.cc:1511] Processing CreateTablet for tablet 08f357037edd4558bd31dcd7fdc918df (DEFAULT_TABLE table=test-table [id=6234d26a13a448ccb0917e2999227566]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:57.483012 31442 tablet_service.cc:1511] Processing CreateTablet for tablet 08f357037edd4558bd31dcd7fdc918df (DEFAULT_TABLE table=test-table [id=6234d26a13a448ccb0917e2999227566]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:57.483412 31574 tablet_service.cc:1511] Processing CreateTablet for tablet 08f357037edd4558bd31dcd7fdc918df (DEFAULT_TABLE table=test-table [id=6234d26a13a448ccb0917e2999227566]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:57.483897 31712 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 08f357037edd4558bd31dcd7fdc918df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:57.484175 31442 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 08f357037edd4558bd31dcd7fdc918df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:57.484472 31574 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 08f357037edd4558bd31dcd7fdc918df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:57.491192 31795 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Bootstrap starting.
I20260504 14:07:57.492082 31797 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Bootstrap starting.
I20260504 14:07:57.492081 31796 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: Bootstrap starting.
I20260504 14:07:57.493722 31795 tablet_bootstrap.cc:654] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:57.494132 31796 tablet_bootstrap.cc:654] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:57.494310 31797 tablet_bootstrap.cc:654] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:57.494930 31796 log.cc:826] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:57.494921 31795 log.cc:826] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:57.495031 31797 log.cc:826] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:57.497198 31797 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: No bootstrap required, opened a new log
I20260504 14:07:57.497200 31795 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: No bootstrap required, opened a new log
I20260504 14:07:57.497200 31796 tablet_bootstrap.cc:492] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: No bootstrap required, opened a new log
I20260504 14:07:57.497435 31797 ts_tablet_manager.cc:1403] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Time spent bootstrapping tablet: real 0.006s	user 0.000s	sys 0.004s
I20260504 14:07:57.497437 31796 ts_tablet_manager.cc:1403] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: Time spent bootstrapping tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:07:57.497437 31795 ts_tablet_manager.cc:1403] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Time spent bootstrapping tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:07:57.500715 31797 raft_consensus.cc:359] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.500768 31796 raft_consensus.cc:359] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.500995 31796 raft_consensus.cc:385] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:57.500998 31797 raft_consensus.cc:385] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:57.501068 31797 raft_consensus.cc:740] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bbc0169188e74d67be47fb9bcfc349ba, State: Initialized, Role: FOLLOWER
I20260504 14:07:57.501060 31796 raft_consensus.cc:740] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6b71a80afb2d4b50b13e3dc6b935099c, State: Initialized, Role: FOLLOWER
I20260504 14:07:57.501060 31795 raft_consensus.cc:359] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.501255 31795 raft_consensus.cc:385] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:57.501317 31795 raft_consensus.cc:740] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d38b4c7dfabb42bb82ec719169d3444c, State: Initialized, Role: FOLLOWER
I20260504 14:07:57.501629 31797 consensus_queue.cc:260] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.501648 31795 consensus_queue.cc:260] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.501629 31796 consensus_queue.cc:260] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.502480 31643 heartbeater.cc:499] Master 127.25.254.254:38955 was elected leader, sending a full tablet report...
I20260504 14:07:57.502717 31795 ts_tablet_manager.cc:1434] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Time spent starting tablet: real 0.005s	user 0.006s	sys 0.000s
I20260504 14:07:57.502924 31508 heartbeater.cc:499] Master 127.25.254.254:38955 was elected leader, sending a full tablet report...
I20260504 14:07:57.503512 31796 ts_tablet_manager.cc:1434] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: Time spent starting tablet: real 0.006s	user 0.006s	sys 0.000s
I20260504 14:07:57.504037 31797 ts_tablet_manager.cc:1434] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Time spent starting tablet: real 0.007s	user 0.005s	sys 0.000s
I20260504 14:07:57.504446 31778 heartbeater.cc:499] Master 127.25.254.254:38955 was elected leader, sending a full tablet report...
I20260504 14:07:57.549827 31803 raft_consensus.cc:493] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:57.550024 31803 raft_consensus.cc:515] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.551216 31803 leader_election.cc:290] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 6b71a80afb2d4b50b13e3dc6b935099c (127.25.254.193:42145), d38b4c7dfabb42bb82ec719169d3444c (127.25.254.194:44393)
I20260504 14:07:57.554916 31780 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.551497 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42145 (local address 127.25.254.195:33103)
0504 14:07:57.551641 (+   144us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.551657 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.551755 (+    98us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.552127 (+   372us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.552131 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.552155 (+    24us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:57.552410 (+   255us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.552419 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.553815 (+  1396us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.553819 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.554571 (+   752us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.554579 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.554672 (+    93us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.554686 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.554723 (+    37us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.554779 (+    56us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":53}
I20260504 14:07:57.555424 31804 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.551800 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44393 (local address 127.25.254.195:33549)
0504 14:07:57.552254 (+   454us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.552269 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:57.552385 (+   116us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:57.552862 (+   477us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:57.552867 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:57.552894 (+    27us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:07:57.553221 (+   327us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.553228 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.554082 (+   854us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.554085 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:57.554793 (+   708us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:57.554803 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.554921 (+   118us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.554937 (+    16us) client_negotiation.cc:770] Sending connection context
0504 14:07:57.555212 (+   275us) client_negotiation.cc:241] Negotiation successful
0504 14:07:57.555268 (+    56us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":377,"thread_start_us":157,"threads_started":1}
I20260504 14:07:57.555461 31790 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.551584 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:33103 (local address 127.25.254.193:42145)
0504 14:07:57.551753 (+   169us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.551758 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.551780 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:07:57.551845 (+    65us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.551850 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.551907 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.552004 (+    97us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:57.552555 (+   551us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.553664 (+  1109us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.554710 (+  1046us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.555208 (+   498us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.555246 (+    38us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.555296 (+    50us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.555352 (+    56us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:07:57.555670 31794 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.551950 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:33549 (local address 127.25.254.194:44393)
0504 14:07:57.552081 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.552087 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.552332 (+   245us) server_negotiation.cc:408] Connection header received
0504 14:07:57.552574 (+   242us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.552578 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.552638 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.552749 (+   111us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:07:57.553338 (+   589us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.553958 (+   620us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.554935 (+   977us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.555371 (+   436us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.555400 (+    29us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.555445 (+    45us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.555529 (+    84us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":52}
I20260504 14:07:57.556074 31462 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "08f357037edd4558bd31dcd7fdc918df" candidate_uuid: "bbc0169188e74d67be47fb9bcfc349ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" is_pre_election: true
I20260504 14:07:57.556257 31597 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "08f357037edd4558bd31dcd7fdc918df" candidate_uuid: "bbc0169188e74d67be47fb9bcfc349ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d38b4c7dfabb42bb82ec719169d3444c" is_pre_election: true
I20260504 14:07:57.556363 31462 raft_consensus.cc:2468] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bbc0169188e74d67be47fb9bcfc349ba in term 0.
I20260504 14:07:57.556519 31597 raft_consensus.cc:2468] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bbc0169188e74d67be47fb9bcfc349ba in term 0.
I20260504 14:07:57.556923 31666 leader_election.cc:304] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 6b71a80afb2d4b50b13e3dc6b935099c, bbc0169188e74d67be47fb9bcfc349ba; no voters: 
I20260504 14:07:57.557166 31803 raft_consensus.cc:2804] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:57.557240 31803 raft_consensus.cc:493] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:57.557282 31803 raft_consensus.cc:3060] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:57.558285 31803 raft_consensus.cc:515] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.558667 31803 leader_election.cc:290] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [CANDIDATE]: Term 1 election: Requested vote from peers 6b71a80afb2d4b50b13e3dc6b935099c (127.25.254.193:42145), d38b4c7dfabb42bb82ec719169d3444c (127.25.254.194:44393)
I20260504 14:07:57.559108 31462 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "08f357037edd4558bd31dcd7fdc918df" candidate_uuid: "bbc0169188e74d67be47fb9bcfc349ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "6b71a80afb2d4b50b13e3dc6b935099c"
I20260504 14:07:57.559124 31597 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "08f357037edd4558bd31dcd7fdc918df" candidate_uuid: "bbc0169188e74d67be47fb9bcfc349ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d38b4c7dfabb42bb82ec719169d3444c"
I20260504 14:07:57.559216 31462 raft_consensus.cc:3060] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:57.559265 31597 raft_consensus.cc:3060] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:57.560261 31462 raft_consensus.cc:2468] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bbc0169188e74d67be47fb9bcfc349ba in term 1.
I20260504 14:07:57.560531 31597 raft_consensus.cc:2468] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bbc0169188e74d67be47fb9bcfc349ba in term 1.
I20260504 14:07:57.560647 31666 leader_election.cc:304] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 6b71a80afb2d4b50b13e3dc6b935099c, bbc0169188e74d67be47fb9bcfc349ba; no voters: 
I20260504 14:07:57.560976 31803 raft_consensus.cc:2804] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:57.561213 31803 raft_consensus.cc:697] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 1 LEADER]: Becoming Leader. State: Replica: bbc0169188e74d67be47fb9bcfc349ba, State: Running, Role: LEADER
I20260504 14:07:57.561545 31803 consensus_queue.cc:237] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } }
I20260504 14:07:57.564783 31315 catalog_manager.cc:5671] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba reported cstate change: term changed from 0 to 1, leader changed from <none> to bbc0169188e74d67be47fb9bcfc349ba (127.25.254.195). New cstate: current_term: 1 leader_uuid: "bbc0169188e74d67be47fb9bcfc349ba" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bbc0169188e74d67be47fb9bcfc349ba" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 42309 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 } health_report { overall_health: UNKNOWN } } }
I20260504 14:07:57.581581 31793 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.578281 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53562 (local address 127.25.254.195:42309)
0504 14:07:57.578420 (+   139us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.578423 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.578437 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:07:57.578607 (+   170us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.578610 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.578665 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.578763 (+    98us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:07:57.579274 (+   511us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.579776 (+   502us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:57.580455 (+   679us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:57.580608 (+   153us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:57.580691 (+    83us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:07:57.581075 (+   384us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:07:57.581175 (+   100us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.581401 (+   226us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.581450 (+    49us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
I20260504 14:07:57.588162 31597 raft_consensus.cc:1275] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 1 FOLLOWER]: Refusing update from remote peer bbc0169188e74d67be47fb9bcfc349ba: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:57.588162 31462 raft_consensus.cc:1275] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 1 FOLLOWER]: Refusing update from remote peer bbc0169188e74d67be47fb9bcfc349ba: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:57.588933 31805 consensus_queue.cc:1048] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [LEADER]: Connected to new peer: Peer: permanent_uuid: "d38b4c7dfabb42bb82ec719169d3444c" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44393 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:57.589145 31803 consensus_queue.cc:1048] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [LEADER]: Connected to new peer: Peer: permanent_uuid: "6b71a80afb2d4b50b13e3dc6b935099c" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 42145 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:57.599102 31809 mvcc.cc:204] Tried to move back new op lower bound from 7282293463392432128 to 7282293463295455232. Current Snapshot: MvccSnapshot[applied={T|T < 7282293463392432128}]
I20260504 14:07:57.606297 31316 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:44344:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:57.606451 31316 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:44344:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:57.609048 31316 catalog_manager.cc:5958] T 00000000000000000000000000000000 P f0520c9b48f64804bfdda570ef59f3ec: Sending DeleteTablet for 3 replicas of tablet 08f357037edd4558bd31dcd7fdc918df
I20260504 14:07:57.609864 31712 tablet_service.cc:1558] Processing DeleteTablet for tablet 08f357037edd4558bd31dcd7fdc918df with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:53550
I20260504 14:07:57.609926 31574 tablet_service.cc:1558] Processing DeleteTablet for tablet 08f357037edd4558bd31dcd7fdc918df with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:47710
I20260504 14:07:57.609964 31442 tablet_service.cc:1558] Processing DeleteTablet for tablet 08f357037edd4558bd31dcd7fdc918df with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:33784
I20260504 14:07:57.610431 31818 tablet_replica.cc:333] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: stopping tablet replica
I20260504 14:07:57.610546 31820 tablet_replica.cc:333] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c: stopping tablet replica
I20260504 14:07:57.610771 31818 raft_consensus.cc:2243] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:57.610844 31820 raft_consensus.cc:2243] T 08f357037edd4558bd31dcd7fdc918df P 6b71a80afb2d4b50b13e3dc6b935099c [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:57.611060 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31380
I20260504 14:07:57.611122 31818 raft_consensus.cc:2272] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:57.611616 31819 tablet_replica.cc:333] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: stopping tablet replica
I20260504 14:07:57.611871 31819 raft_consensus.cc:2243] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:57.612246 31819 raft_consensus.cc:2272] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:57.614668 31819 ts_tablet_manager.cc:1916] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:57.615242 31818 ts_tablet_manager.cc:1916] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:57.617637 31818 ts_tablet_manager.cc:1929] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:57.617735 31818 log.cc:1199] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/08f357037edd4558bd31dcd7fdc918df
I20260504 14:07:57.618091 31818 ts_tablet_manager.cc:1950] T 08f357037edd4558bd31dcd7fdc918df P d38b4c7dfabb42bb82ec719169d3444c: Deleting consensus metadata
I20260504 14:07:57.618268 31819 ts_tablet_manager.cc:1929] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:57.618427 31819 log.cc:1199] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableWebUI.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/08f357037edd4558bd31dcd7fdc918df
W20260504 14:07:57.618588 31301 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:42145 (error 108)
I20260504 14:07:57.618901 31819 ts_tablet_manager.cc:1950] T 08f357037edd4558bd31dcd7fdc918df P bbc0169188e74d67be47fb9bcfc349ba: Deleting consensus metadata
I20260504 14:07:57.619356 31301 catalog_manager.cc:5002] TS d38b4c7dfabb42bb82ec719169d3444c (127.25.254.194:44393): tablet 08f357037edd4558bd31dcd7fdc918df (table test-table [id=6234d26a13a448ccb0917e2999227566]) successfully deleted
I20260504 14:07:57.619716 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31515
I20260504 14:07:57.620509 31300 catalog_manager.cc:5002] TS bbc0169188e74d67be47fb9bcfc349ba (127.25.254.195:42309): tablet 08f357037edd4558bd31dcd7fdc918df (table test-table [id=6234d26a13a448ccb0917e2999227566]) successfully deleted
I20260504 14:07:57.621963 31791 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.621685 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:42145 (local address 127.0.0.1:33788)
0504 14:07:57.621827 (+   142us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:57.621888 (+    61us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:42145: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":52}
W20260504 14:07:57.622200 31301 catalog_manager.cc:4729] TS 6b71a80afb2d4b50b13e3dc6b935099c (127.25.254.193:42145): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet 08f357037edd4558bd31dcd7fdc918df: Network error: Client connection negotiation failed: client connection to 127.25.254.193:42145: connect: Connection refused (error 111)
I20260504 14:07:57.627413 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31650
I20260504 14:07:57.633256 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31285
2026-05-04T14:07:57Z chronyd exiting
[       OK ] SecurityITest.TestDisableWebUI (3491 ms)
[ RUN      ] SecurityITest.TestDisableAuthenticationEncryption
2026-05-04T14:07:57Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:57Z Disabled control of system clock
I20260504 14:07:57.663902 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46647
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:38925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:46647
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--rpc_authentication=disabled
--rpc_encryption=disabled with env {}
W20260504 14:07:57.768743 31825 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:57.768967 31825 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:57.769014 31825 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:57.772442 31825 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:57.772518 31825 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:57.772536 31825 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:57.772555 31825 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:57.772573 31825 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:07:57.776597 31825 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:46647
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46647
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31825
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:57.777643 31825 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:57.778795 31825 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:57.784898 31833 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:57.784977 31825 server_base.cc:1061] running on GCE node
W20260504 14:07:57.784894 31830 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:57.784893 31831 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:57.785768 31825 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:57.786818 31825 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:57.788007 31825 hybrid_clock.cc:648] HybridClock initialized: now 1777903677787977 us; error 45 us; skew 500 ppm
I20260504 14:07:57.790076 31825 webserver.cc:492] Webserver started at http://127.25.254.254:33903/ using document root <none> and password file <none>
I20260504 14:07:57.790697 31825 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:57.790766 31825 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:57.790987 31825 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:57.792663 31825 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "4b9062ad5cbf4337b13cde76e6d6adbc"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "b95f40bf028eb3ab66d8976fe935389d"
server_key_iv: "1448b54b1f899dabf5d5a17611a92cd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.793175 31825 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "4b9062ad5cbf4337b13cde76e6d6adbc"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "b95f40bf028eb3ab66d8976fe935389d"
server_key_iv: "1448b54b1f899dabf5d5a17611a92cd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.797101 31825 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.002s	sys 0.004s
I20260504 14:07:57.799690 31839 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:57.800988 31825 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:57.801142 31825 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "4b9062ad5cbf4337b13cde76e6d6adbc"
format_stamp: "Formatted at 2026-05-04 14:07:57 on dist-test-slave-2x32"
server_key: "b95f40bf028eb3ab66d8976fe935389d"
server_key_iv: "1448b54b1f899dabf5d5a17611a92cd7"
server_key_version: "encryptionkey@0"
I20260504 14:07:57.801266 31825 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:57.806001 31825 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:57.806615 31825 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:57.806808 31825 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:57.814527 31825 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:46647
I20260504 14:07:57.814533 31891 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:46647 every 8 connection(s)
I20260504 14:07:57.815585 31825 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:57.818629 31892 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:57.818994 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31825
I20260504 14:07:57.819093 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:57.819365 26619 external_mini_cluster.cc:1468] Setting key 93756a9528a499814cf2bd45c31f12b7
I20260504 14:07:57.825285 31895 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:57.820778 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37018 (local address 127.25.254.254:46647)
0504 14:07:57.821443 (+   665us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:57.821456 (+    13us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:57.821499 (+    43us) server_negotiation.cc:408] Connection header received
0504 14:07:57.822342 (+   843us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:57.822368 (+    26us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:57.822784 (+   416us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:57.823154 (+   370us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:57.824054 (+   900us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:57.824060 (+     6us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:57.824074 (+    14us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:57.824086 (+    12us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:57.824130 (+    44us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:57.824144 (+    14us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:57.824306 (+   162us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:57.824397 (+    91us) server_negotiation.cc:300] Negotiation successful
0504 14:07:57.824532 (+   135us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":279,"thread_start_us":107,"threads_started":1}
I20260504 14:07:57.825824 31892 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: Bootstrap starting.
I20260504 14:07:57.828236 31892 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:57.828975 31892 log.cc:826] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:57.831002 31892 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: No bootstrap required, opened a new log
I20260504 14:07:57.833698 31892 raft_consensus.cc:359] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } }
I20260504 14:07:57.833946 31892 raft_consensus.cc:385] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:57.834038 31892 raft_consensus.cc:740] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4b9062ad5cbf4337b13cde76e6d6adbc, State: Initialized, Role: FOLLOWER
I20260504 14:07:57.834496 31892 consensus_queue.cc:260] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } }
I20260504 14:07:57.834649 31892 raft_consensus.cc:399] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:57.834739 31892 raft_consensus.cc:493] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:57.834865 31892 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:57.835762 31892 raft_consensus.cc:515] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } }
I20260504 14:07:57.836128 31892 leader_election.cc:304] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4b9062ad5cbf4337b13cde76e6d6adbc; no voters: 
I20260504 14:07:57.836442 31892 leader_election.cc:290] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:57.836637 31897 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:57.836946 31897 raft_consensus.cc:697] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [term 1 LEADER]: Becoming Leader. State: Replica: 4b9062ad5cbf4337b13cde76e6d6adbc, State: Running, Role: LEADER
I20260504 14:07:57.837276 31897 consensus_queue.cc:237] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } }
I20260504 14:07:57.837709 31892 sys_catalog.cc:565] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:57.838944 31899 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [sys.catalog]: SysCatalogTable state changed. Reason: New leader 4b9062ad5cbf4337b13cde76e6d6adbc. Latest consensus state: current_term: 1 leader_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } } }
I20260504 14:07:57.839059 31899 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:57.839511 31898 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4b9062ad5cbf4337b13cde76e6d6adbc" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46647 } } }
I20260504 14:07:57.839665 31898 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:57.840034 31906 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:57.842903 31906 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:57.848104 31906 catalog_manager.cc:1357] Generated new cluster ID: 7a007409f39f4cbaa29d338878f093cf
I20260504 14:07:57.848186 31906 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:57.868943 31906 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:57.869311 31906 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:57.876281 31906 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: Generated new TSK 0
I20260504 14:07:57.876996 31906 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:07:57.884222 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--builtin_ntp_servers=127.25.254.212:38925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_authentication=disabled
--rpc_encryption=disabled with env {}
W20260504 14:07:57.989713 31916 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:57.989949 31916 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:57.990037 31916 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:57.993599 31916 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:57.993673 31916 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:57.993753 31916 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:07:57.998219 31916 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.31916
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:57.999388 31916 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:58.000542 31916 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:58.007189 31921 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.007318 31924 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.007210 31922 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.007587 31916 server_base.cc:1061] running on GCE node
I20260504 14:07:58.008004 31916 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:58.008555 31916 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:58.009749 31916 hybrid_clock.cc:648] HybridClock initialized: now 1777903678009726 us; error 35 us; skew 500 ppm
I20260504 14:07:58.011592 31916 webserver.cc:492] Webserver started at http://127.25.254.193:42113/ using document root <none> and password file <none>
I20260504 14:07:58.012156 31916 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:58.012240 31916 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:58.012485 31916 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:58.014200 31916 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "9cda006d51c14c1cbc67a116171694a9"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "2f6c03c9d8e3588960c0ef0260c43135"
server_key_iv: "b0de8e27a3643589ca4e201d53d2ff1e"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.014698 31916 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "9cda006d51c14c1cbc67a116171694a9"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "2f6c03c9d8e3588960c0ef0260c43135"
server_key_iv: "b0de8e27a3643589ca4e201d53d2ff1e"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.017907 31916 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:07:58.020169 31930 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.021240 31916 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:07:58.021373 31916 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "9cda006d51c14c1cbc67a116171694a9"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "2f6c03c9d8e3588960c0ef0260c43135"
server_key_iv: "b0de8e27a3643589ca4e201d53d2ff1e"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.021474 31916 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:58.026613 31916 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:58.027241 31916 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:58.027433 31916 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:58.027983 31916 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:58.028957 31916 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:58.029028 31916 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.029098 31916 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:58.029148 31916 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.038601 31916 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:36105
I20260504 14:07:58.038807 32043 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:36105 every 8 connection(s)
I20260504 14:07:58.039700 31916 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:07:58.040205 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 31916
I20260504 14:07:58.040325 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:07:58.040707 26619 external_mini_cluster.cc:1468] Setting key 054629e3f2c972a34aeac5284aee1b1f
I20260504 14:07:58.043042 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--builtin_ntp_servers=127.25.254.212:38925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_authentication=disabled
--rpc_encryption=disabled with env {}
I20260504 14:07:58.045706 31895 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.041710 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:52159 (local address 127.25.254.254:46647)
0504 14:07:58.041873 (+   163us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.041877 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.042859 (+   982us) server_negotiation.cc:408] Connection header received
0504 14:07:58.043688 (+   829us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.043694 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.043744 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.043841 (+    97us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.045113 (+  1272us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.045120 (+     7us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.045126 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.045141 (+    15us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.045167 (+    26us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.045179 (+    12us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.045271 (+    92us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.045412 (+   141us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.045449 (+    37us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":69}
I20260504 14:07:58.046116 32046 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.042035 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46647 (local address 127.25.254.193:52159)
0504 14:07:58.042701 (+   666us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.042750 (+    49us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.043502 (+   752us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.044126 (+   624us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.044153 (+    27us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.044256 (+   103us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.044835 (+   579us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.044839 (+     4us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.044876 (+    37us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.044940 (+    64us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.045273 (+   333us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.045279 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.045283 (+     4us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.045418 (+   135us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.045499 (+    81us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":292,"thread_start_us":116,"threads_started":1}
I20260504 14:07:58.047595 32044 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46647
I20260504 14:07:58.047840 32044 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:58.048233 32044 heartbeater.cc:507] Master 127.25.254.254:46647 requested a full tablet report, sending...
I20260504 14:07:58.049724 31856 ts_manager.cc:194] Registered new tserver with Master: 9cda006d51c14c1cbc67a116171694a9 (127.25.254.193:36105)
W20260504 14:07:58.147264 32047 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:58.147493 32047 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:58.147545 32047 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:58.150925 32047 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:58.150982 32047 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:58.151098 32047 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:07:58.154948 32047 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32047
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:58.156042 32047 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:58.157207 32047 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:58.164004 32053 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.164242 32047 server_base.cc:1061] running on GCE node
W20260504 14:07:58.164122 32052 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.164004 32055 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.164913 32047 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:58.165660 32047 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:58.166882 32047 hybrid_clock.cc:648] HybridClock initialized: now 1777903678166837 us; error 61 us; skew 500 ppm
I20260504 14:07:58.169008 32047 webserver.cc:492] Webserver started at http://127.25.254.194:35035/ using document root <none> and password file <none>
I20260504 14:07:58.169605 32047 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:58.169694 32047 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:58.169927 32047 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:58.171671 32047 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "1746add1e64e4e5fb7a9dc6b875667cf"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "b13133a55c2b875ad18629ed78389292"
server_key_iv: "62022fc2913f60d3c1635f9de699b114"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.172173 32047 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "1746add1e64e4e5fb7a9dc6b875667cf"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "b13133a55c2b875ad18629ed78389292"
server_key_iv: "62022fc2913f60d3c1635f9de699b114"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.177280 32047 fs_manager.cc:696] Time spent creating directory manager: real 0.005s	user 0.006s	sys 0.000s
I20260504 14:07:58.179953 32061 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.181229 32047 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:58.181380 32047 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "1746add1e64e4e5fb7a9dc6b875667cf"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "b13133a55c2b875ad18629ed78389292"
server_key_iv: "62022fc2913f60d3c1635f9de699b114"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.181500 32047 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:58.186764 32047 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:58.187384 32047 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:58.187580 32047 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:58.188141 32047 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:58.189183 32047 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:58.189249 32047 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.189314 32047 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:58.189359 32047 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.198612 32047 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:44557
I20260504 14:07:58.198632 32174 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:44557 every 8 connection(s)
I20260504 14:07:58.199648 32047 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:07:58.204975 31895 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.201456 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:35361 (local address 127.25.254.254:46647)
0504 14:07:58.201592 (+   136us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.201596 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.202454 (+   858us) server_negotiation.cc:408] Connection header received
0504 14:07:58.203288 (+   834us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.203293 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.203341 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.203444 (+   103us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.204466 (+  1022us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.204471 (+     5us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.204474 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.204485 (+    11us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.204516 (+    31us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.204524 (+     8us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.204630 (+   106us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.204786 (+   156us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.204828 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":43}
I20260504 14:07:58.205644 32177 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.201688 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46647 (local address 127.25.254.194:35361)
0504 14:07:58.202320 (+   632us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.202352 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.203089 (+   737us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.203594 (+   505us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.203602 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.203662 (+    60us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.204247 (+   585us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.204249 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.204274 (+    25us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.204319 (+    45us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.204622 (+   303us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.204628 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.204635 (+     7us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.204826 (+   191us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.204927 (+   101us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":234,"thread_start_us":100,"threads_started":1}
I20260504 14:07:58.207020 32175 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46647
I20260504 14:07:58.207306 32175 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:58.207759 32175 heartbeater.cc:507] Master 127.25.254.254:46647 requested a full tablet report, sending...
I20260504 14:07:58.209077 31856 ts_manager.cc:194] Registered new tserver with Master: 1746add1e64e4e5fb7a9dc6b875667cf (127.25.254.194:44557)
I20260504 14:07:58.209344 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32047
I20260504 14:07:58.209481 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:07:58.209847 26619 external_mini_cluster.cc:1468] Setting key 9b1b198f7601ad70fbac03c75212b8b8
I20260504 14:07:58.212126 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--builtin_ntp_servers=127.25.254.212:38925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_authentication=disabled
--rpc_encryption=disabled with env {}
W20260504 14:07:58.319918 32178 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:58.320153 32178 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:58.320205 32178 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:58.323578 32178 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:58.323642 32178 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:58.323719 32178 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:07:58.327948 32178 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46647
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32178
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:58.329038 32178 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:58.330123 32178 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:58.337177 32183 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.337286 32178 server_base.cc:1061] running on GCE node
W20260504 14:07:58.337177 32186 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.337204 32184 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.337919 32178 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:58.338500 32178 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:58.339656 32178 hybrid_clock.cc:648] HybridClock initialized: now 1777903678339630 us; error 41 us; skew 500 ppm
I20260504 14:07:58.341538 32178 webserver.cc:492] Webserver started at http://127.25.254.195:41549/ using document root <none> and password file <none>
I20260504 14:07:58.342119 32178 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:58.342252 32178 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:58.342468 32178 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:58.344200 32178 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "f64e1371a0194da0ac2051ce739e5db5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "cbafe12e9e9649e7ea6828d2e55ca45b"
server_key_iv: "e7c285d73a8e93a916d403d5758c691c"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.344767 32178 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "f64e1371a0194da0ac2051ce739e5db5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "cbafe12e9e9649e7ea6828d2e55ca45b"
server_key_iv: "e7c285d73a8e93a916d403d5758c691c"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.348292 32178 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.002s
I20260504 14:07:58.350831 32192 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.351888 32178 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:07:58.351987 32178 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "f64e1371a0194da0ac2051ce739e5db5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "cbafe12e9e9649e7ea6828d2e55ca45b"
server_key_iv: "e7c285d73a8e93a916d403d5758c691c"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.352131 32178 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:58.356979 32178 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:58.357553 32178 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:58.357751 32178 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:58.358347 32178 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:07:58.359364 32178 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:07:58.359438 32178 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.359508 32178 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:07:58.359550 32178 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.369514 32178 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:46527
I20260504 14:07:58.369577 32305 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:46527 every 8 connection(s)
I20260504 14:07:58.370661 32178 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:07:58.376281 31895 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.372576 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:59649 (local address 127.25.254.254:46647)
0504 14:07:58.372810 (+   234us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.372814 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.373628 (+   814us) server_negotiation.cc:408] Connection header received
0504 14:07:58.374565 (+   937us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.374569 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.374624 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.374730 (+   106us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.375753 (+  1023us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.375757 (+     4us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.375760 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.375773 (+    13us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.375805 (+    32us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.375814 (+     9us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.375919 (+   105us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.376094 (+   175us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.376136 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":104}
I20260504 14:07:58.376952 32308 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.372843 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46647 (local address 127.25.254.195:59649)
0504 14:07:58.373433 (+   590us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.373464 (+    31us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.374355 (+   891us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.374895 (+   540us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.374905 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.374969 (+    64us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.375510 (+   541us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.375512 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.375537 (+    25us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.375594 (+    57us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.375936 (+   342us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.375947 (+    11us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.375956 (+     9us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.376144 (+   188us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.376251 (+   107us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":258,"thread_start_us":96,"threads_started":1}
I20260504 14:07:58.378309 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32178
I20260504 14:07:58.378386 32306 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46647
I20260504 14:07:58.378448 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:07:58.378715 26619 external_mini_cluster.cc:1468] Setting key e185cb04b4bc63cdc04202f8cf768e71
I20260504 14:07:58.378746 32306 heartbeater.cc:461] Registering TS with master...
I20260504 14:07:58.379292 32306 heartbeater.cc:507] Master 127.25.254.254:46647 requested a full tablet report, sending...
I20260504 14:07:58.380390 31855 ts_manager.cc:194] Registered new tserver with Master: f64e1371a0194da0ac2051ce739e5db5 (127.25.254.195:46527)
I20260504 14:07:58.393186 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:07:58.397138 31895 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.395280 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:38694 (local address 127.25.254.254:46647)
0504 14:07:58.395441 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.395447 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.395662 (+   215us) server_negotiation.cc:408] Connection header received
0504 14:07:58.395766 (+   104us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.395769 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.395833 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.395903 (+    70us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.396670 (+   767us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.396674 (+     4us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.396676 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.396685 (+     9us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.396703 (+    18us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.396710 (+     7us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.396786 (+    76us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.396919 (+   133us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.396947 (+    28us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:07:58.400331 31855 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:38694:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:07:58.402956 31855 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:07:58.415550 32320 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.412590 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44557 (local address 127.0.0.1:46946)
0504 14:07:58.413096 (+   506us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.413136 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.413238 (+   102us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.414503 (+  1265us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.414507 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.414523 (+    16us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.414922 (+   399us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.414927 (+     5us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.414939 (+    12us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.414959 (+    20us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.415274 (+   315us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.415277 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.415280 (+     3us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.415353 (+    73us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.415398 (+    45us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":370,"thread_start_us":109,"threads_started":1}
I20260504 14:07:58.415561 32321 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.412997 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:46946 (local address 127.25.254.194:44557)
0504 14:07:58.413508 (+   511us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.413517 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.413541 (+    24us) server_negotiation.cc:408] Connection header received
0504 14:07:58.413601 (+    60us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.413606 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.413761 (+   155us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.413903 (+   142us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.415085 (+  1182us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.415094 (+     9us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.415106 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.415120 (+    14us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.415157 (+    37us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.415170 (+    13us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.415319 (+   149us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.415365 (+    46us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.415424 (+    59us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":409,"thread_start_us":88,"threads_started":1}
I20260504 14:07:58.415609 32318 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.412227 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:39526 (local address 127.25.254.195:46527)
0504 14:07:58.412558 (+   331us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.412567 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.413432 (+   865us) server_negotiation.cc:408] Connection header received
0504 14:07:58.413653 (+   221us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.413658 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.413766 (+   108us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.413903 (+   137us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.414939 (+  1036us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.414946 (+     7us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.414963 (+    17us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.414973 (+    10us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.415031 (+    58us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.415044 (+    13us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.415175 (+   131us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.415342 (+   167us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.415429 (+    87us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":231,"thread_start_us":148,"threads_started":1}
I20260504 14:07:58.416009 32317 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.412081 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:46527 (local address 127.0.0.1:39526)
0504 14:07:58.413327 (+  1246us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.413343 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.413507 (+   164us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.414273 (+   766us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.414277 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.414292 (+    15us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.414722 (+   430us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.414724 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.414735 (+    11us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.414775 (+    40us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.415150 (+   375us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.415157 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.415163 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.415821 (+   658us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.415859 (+    38us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":561,"thread_start_us":81,"threads_started":1}
I20260504 14:07:58.416467 32319 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.412748 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:36105 (local address 127.0.0.1:46156)
0504 14:07:58.413381 (+   633us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.413393 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.413454 (+    61us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.413913 (+   459us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.413918 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.413977 (+    59us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.414700 (+   723us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.414703 (+     3us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.414723 (+    20us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.414779 (+    56us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.416229 (+  1450us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.416232 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.416234 (+     2us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.416325 (+    91us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.416354 (+    29us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":561,"thread_start_us":92,"threads_started":1}
I20260504 14:07:58.416958 32322 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.413041 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:46156 (local address 127.25.254.193:36105)
0504 14:07:58.413517 (+   476us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.413524 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.413542 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:07:58.413603 (+    61us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.413610 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.413751 (+   141us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.413904 (+   153us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.415252 (+  1348us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.415259 (+     7us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.415270 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.415284 (+    14us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.415944 (+   660us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.415964 (+    20us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.416682 (+   718us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.416740 (+    58us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.416803 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":381,"thread_start_us":126,"threads_started":1}
I20260504 14:07:58.418954 32109 tablet_service.cc:1511] Processing CreateTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 (DEFAULT_TABLE table=test-table [id=f13f9d9618e9496d93ccff3a0dd09027]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:58.418954 32239 tablet_service.cc:1511] Processing CreateTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 (DEFAULT_TABLE table=test-table [id=f13f9d9618e9496d93ccff3a0dd09027]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:58.419482 31978 tablet_service.cc:1511] Processing CreateTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 (DEFAULT_TABLE table=test-table [id=f13f9d9618e9496d93ccff3a0dd09027]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:07:58.419998 32109 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 38347f39e51d49f1bdd9dc6de37c95f0. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:58.420013 32239 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 38347f39e51d49f1bdd9dc6de37c95f0. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:58.420480 31978 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 38347f39e51d49f1bdd9dc6de37c95f0. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:58.425993 32323 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Bootstrap starting.
I20260504 14:07:58.427182 32325 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Bootstrap starting.
I20260504 14:07:58.428561 32323 tablet_bootstrap.cc:654] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:58.429113 32325 tablet_bootstrap.cc:654] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:58.429423 32323 log.cc:826] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:58.429843 32325 log.cc:826] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:58.430115 32324 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Bootstrap starting.
I20260504 14:07:58.431723 32325 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: No bootstrap required, opened a new log
I20260504 14:07:58.431941 32325 ts_tablet_manager.cc:1403] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Time spent bootstrapping tablet: real 0.005s	user 0.003s	sys 0.000s
I20260504 14:07:58.431950 32324 tablet_bootstrap.cc:654] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:58.432570 32324 log.cc:826] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:58.434003 32323 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: No bootstrap required, opened a new log
I20260504 14:07:58.434221 32323 ts_tablet_manager.cc:1403] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Time spent bootstrapping tablet: real 0.008s	user 0.005s	sys 0.000s
I20260504 14:07:58.434461 32324 tablet_bootstrap.cc:492] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: No bootstrap required, opened a new log
I20260504 14:07:58.434660 32324 ts_tablet_manager.cc:1403] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:07:58.434852 32325 raft_consensus.cc:359] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.435098 32325 raft_consensus.cc:385] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:58.435165 32325 raft_consensus.cc:740] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1746add1e64e4e5fb7a9dc6b875667cf, State: Initialized, Role: FOLLOWER
I20260504 14:07:58.435590 32325 consensus_queue.cc:260] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.436235 32175 heartbeater.cc:499] Master 127.25.254.254:46647 was elected leader, sending a full tablet report...
I20260504 14:07:58.436481 32325 ts_tablet_manager.cc:1434] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Time spent starting tablet: real 0.004s	user 0.001s	sys 0.004s
I20260504 14:07:58.437294 32323 raft_consensus.cc:359] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.437503 32323 raft_consensus.cc:385] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:58.437568 32323 raft_consensus.cc:740] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9cda006d51c14c1cbc67a116171694a9, State: Initialized, Role: FOLLOWER
I20260504 14:07:58.437502 32324 raft_consensus.cc:359] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.437693 32324 raft_consensus.cc:385] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:58.437744 32324 raft_consensus.cc:740] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f64e1371a0194da0ac2051ce739e5db5, State: Initialized, Role: FOLLOWER
I20260504 14:07:58.438046 32323 consensus_queue.cc:260] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.438062 32324 consensus_queue.cc:260] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.438938 32324 ts_tablet_manager.cc:1434] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Time spent starting tablet: real 0.004s	user 0.004s	sys 0.000s
I20260504 14:07:58.438982 32306 heartbeater.cc:499] Master 127.25.254.254:46647 was elected leader, sending a full tablet report...
I20260504 14:07:58.439044 32044 heartbeater.cc:499] Master 127.25.254.254:46647 was elected leader, sending a full tablet report...
I20260504 14:07:58.440454 32323 ts_tablet_manager.cc:1434] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Time spent starting tablet: real 0.006s	user 0.001s	sys 0.003s
W20260504 14:07:58.450529 32176 tablet.cc:2404] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:07:58.540995 32045 tablet.cc:2404] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:07:58.621667 32307 tablet.cc:2404] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:07:58.626968 32329 raft_consensus.cc:493] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:07:58.627162 32329 raft_consensus.cc:515] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.628288 32329 leader_election.cc:290] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f64e1371a0194da0ac2051ce739e5db5 (127.25.254.195:46527), 9cda006d51c14c1cbc67a116171694a9 (127.25.254.193:36105)
I20260504 14:07:58.630900 32177 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.628644 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:46527 (local address 127.25.254.194:37451)
0504 14:07:58.628891 (+   247us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.628909 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.629060 (+   151us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.629544 (+   484us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.629547 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.629560 (+    13us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.630187 (+   627us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.630191 (+     4us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.630207 (+    16us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.630236 (+    29us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.630604 (+   368us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.630609 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.630613 (+     4us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.630723 (+   110us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.630756 (+    33us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":77}
I20260504 14:07:58.630980 32318 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.628713 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:37451 (local address 127.25.254.195:46527)
0504 14:07:58.628892 (+   179us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.628899 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.629001 (+   102us) server_negotiation.cc:408] Connection header received
0504 14:07:58.629304 (+   303us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.629309 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.629366 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.629472 (+   106us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.630379 (+   907us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.630384 (+     5us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.630388 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.630401 (+    13us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.630432 (+    31us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.630445 (+    13us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.630554 (+   109us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.630714 (+   160us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.630818 (+   104us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":72}
I20260504 14:07:58.631264 32322 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.629178 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:48705 (local address 127.25.254.193:36105)
0504 14:07:58.629344 (+   166us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.629349 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.629716 (+   367us) server_negotiation.cc:408] Connection header received
0504 14:07:58.629842 (+   126us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.629848 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.629895 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.629990 (+    95us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.630808 (+   818us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.630812 (+     4us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.630815 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.630827 (+    12us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.630850 (+    23us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.630859 (+     9us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.630953 (+    94us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.631098 (+   145us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.631145 (+    47us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":70}
I20260504 14:07:58.631661 32332 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.629018 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:36105 (local address 127.25.254.194:48705)
0504 14:07:58.629509 (+   491us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.629593 (+    84us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:58.629699 (+   106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:58.630081 (+   382us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:58.630084 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:58.630118 (+    34us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:07:58.630626 (+   508us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:07:58.630628 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:07:58.630642 (+    14us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:07:58.630670 (+    28us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:07:58.630961 (+   291us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:07:58.630967 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:07:58.630973 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:07:58.631458 (+   485us) client_negotiation.cc:241] Negotiation successful
0504 14:07:58.631496 (+    38us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":342,"thread_start_us":214,"threads_started":1}
I20260504 14:07:58.632109 32260 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "38347f39e51d49f1bdd9dc6de37c95f0" candidate_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f64e1371a0194da0ac2051ce739e5db5" is_pre_election: true
I20260504 14:07:58.632328 31998 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "38347f39e51d49f1bdd9dc6de37c95f0" candidate_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9cda006d51c14c1cbc67a116171694a9" is_pre_election: true
I20260504 14:07:58.632437 32260 raft_consensus.cc:2468] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1746add1e64e4e5fb7a9dc6b875667cf in term 0.
I20260504 14:07:58.632555 31998 raft_consensus.cc:2468] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1746add1e64e4e5fb7a9dc6b875667cf in term 0.
I20260504 14:07:58.632932 32065 leader_election.cc:304] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 1746add1e64e4e5fb7a9dc6b875667cf, f64e1371a0194da0ac2051ce739e5db5; no voters: 
I20260504 14:07:58.633164 32329 raft_consensus.cc:2804] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:07:58.633239 32329 raft_consensus.cc:493] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:07:58.633270 32329 raft_consensus.cc:3060] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:58.634420 32329 raft_consensus.cc:515] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.634821 32329 leader_election.cc:290] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [CANDIDATE]: Term 1 election: Requested vote from peers f64e1371a0194da0ac2051ce739e5db5 (127.25.254.195:46527), 9cda006d51c14c1cbc67a116171694a9 (127.25.254.193:36105)
I20260504 14:07:58.635361 32260 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "38347f39e51d49f1bdd9dc6de37c95f0" candidate_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f64e1371a0194da0ac2051ce739e5db5"
I20260504 14:07:58.635361 31998 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "38347f39e51d49f1bdd9dc6de37c95f0" candidate_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9cda006d51c14c1cbc67a116171694a9"
I20260504 14:07:58.635501 31998 raft_consensus.cc:3060] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:58.635573 32260 raft_consensus.cc:3060] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:58.636919 32260 raft_consensus.cc:2468] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1746add1e64e4e5fb7a9dc6b875667cf in term 1.
I20260504 14:07:58.637328 32065 leader_election.cc:304] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 1746add1e64e4e5fb7a9dc6b875667cf, f64e1371a0194da0ac2051ce739e5db5; no voters: 
I20260504 14:07:58.637362 31998 raft_consensus.cc:2468] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1746add1e64e4e5fb7a9dc6b875667cf in term 1.
I20260504 14:07:58.637557 32329 raft_consensus.cc:2804] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:58.637811 32329 raft_consensus.cc:697] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 1 LEADER]: Becoming Leader. State: Replica: 1746add1e64e4e5fb7a9dc6b875667cf, State: Running, Role: LEADER
I20260504 14:07:58.638154 32329 consensus_queue.cc:237] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } }
I20260504 14:07:58.642009 31854 catalog_manager.cc:5671] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf reported cstate change: term changed from 0 to 1, leader changed from <none> to 1746add1e64e4e5fb7a9dc6b875667cf (127.25.254.194). New cstate: current_term: 1 leader_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1746add1e64e4e5fb7a9dc6b875667cf" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44557 } health_report { overall_health: HEALTHY } } }
I20260504 14:07:58.654063 32321 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.652221 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:46958 (local address 127.25.254.194:44557)
0504 14:07:58.652405 (+   184us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.652409 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.652424 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:07:58.652525 (+   101us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.652527 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.652632 (+   105us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.652710 (+    78us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.653511 (+   801us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.653514 (+     3us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.653516 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.653526 (+    10us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.653593 (+    67us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.653602 (+     9us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.653682 (+    80us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.653839 (+   157us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.653870 (+    31us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":94}
I20260504 14:07:58.660442 32260 raft_consensus.cc:1275] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 1 FOLLOWER]: Refusing update from remote peer 1746add1e64e4e5fb7a9dc6b875667cf: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:58.660586 31998 raft_consensus.cc:1275] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9 [term 1 FOLLOWER]: Refusing update from remote peer 1746add1e64e4e5fb7a9dc6b875667cf: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:07:58.661459 32333 consensus_queue.cc:1048] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [LEADER]: Connected to new peer: Peer: permanent_uuid: "f64e1371a0194da0ac2051ce739e5db5" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 46527 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:58.661706 32329 consensus_queue.cc:1048] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [LEADER]: Connected to new peer: Peer: permanent_uuid: "9cda006d51c14c1cbc67a116171694a9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 36105 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:07:58.670502 32337 mvcc.cc:204] Tried to move back new op lower bound from 7282293467784282112 to 7282293467705274368. Current Snapshot: MvccSnapshot[applied={T|T < 7282293467784282112}]
I20260504 14:07:58.679860 31854 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='slave'} at 127.0.0.1:38694:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:58.680032 31854 catalog_manager.cc:2755] Servicing DeleteTable request from {username='slave'} at 127.0.0.1:38694:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:07:58.683108 31854 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 4b9062ad5cbf4337b13cde76e6d6adbc: Sending DeleteTablet for 3 replicas of tablet 38347f39e51d49f1bdd9dc6de37c95f0
I20260504 14:07:58.684331 32109 tablet_service.cc:1558] Processing DeleteTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:58 UTC) from {username='slave'} at 127.0.0.1:46946
I20260504 14:07:58.684327 32239 tablet_service.cc:1558] Processing DeleteTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:58 UTC) from {username='slave'} at 127.0.0.1:39526
I20260504 14:07:58.684404 31978 tablet_service.cc:1558] Processing DeleteTablet for tablet 38347f39e51d49f1bdd9dc6de37c95f0 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:07:58 UTC) from {username='slave'} at 127.0.0.1:46156
I20260504 14:07:58.684916 32346 tablet_replica.cc:333] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: stopping tablet replica
I20260504 14:07:58.685273 32346 raft_consensus.cc:2243] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:07:58.685303 32348 tablet_replica.cc:333] T 38347f39e51d49f1bdd9dc6de37c95f0 P 9cda006d51c14c1cbc67a116171694a9: stopping tablet replica
I20260504 14:07:58.685628 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31916
I20260504 14:07:58.685921 32346 raft_consensus.cc:2272] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:58.686503 32347 tablet_replica.cc:333] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: stopping tablet replica
I20260504 14:07:58.686793 32347 raft_consensus.cc:2243] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:07:58.687115 32347 raft_consensus.cc:2272] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:07:58.687803 32346 ts_tablet_manager.cc:1916] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:58.690629 32346 ts_tablet_manager.cc:1929] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:58.690743 32346 log.cc:1199] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/38347f39e51d49f1bdd9dc6de37c95f0
I20260504 14:07:58.691140 32346 ts_tablet_manager.cc:1950] T 38347f39e51d49f1bdd9dc6de37c95f0 P 1746add1e64e4e5fb7a9dc6b875667cf: Deleting consensus metadata
I20260504 14:07:58.691318 32347 ts_tablet_manager.cc:1916] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:07:58.692488 31840 catalog_manager.cc:5002] TS 1746add1e64e4e5fb7a9dc6b875667cf (127.25.254.194:44557): tablet 38347f39e51d49f1bdd9dc6de37c95f0 (table test-table [id=f13f9d9618e9496d93ccff3a0dd09027]) successfully deleted
I20260504 14:07:58.694541 32347 ts_tablet_manager.cc:1929] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:07:58.694654 32347 log.cc:1199] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestDisableAuthenticationEncryption.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/38347f39e51d49f1bdd9dc6de37c95f0
W20260504 14:07:58.694898 31843 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:36105 (error 108)
I20260504 14:07:58.695024 32347 ts_tablet_manager.cc:1950] T 38347f39e51d49f1bdd9dc6de37c95f0 P f64e1371a0194da0ac2051ce739e5db5: Deleting consensus metadata
I20260504 14:07:58.695139 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32047
I20260504 14:07:58.697289 32319 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.696948 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:36105 (local address 127.0.0.1:46160)
0504 14:07:58.697124 (+   176us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:58.697191 (+    67us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:36105: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":69}
W20260504 14:07:58.697474 31843 catalog_manager.cc:4729] TS 9cda006d51c14c1cbc67a116171694a9 (127.25.254.193:36105): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet 38347f39e51d49f1bdd9dc6de37c95f0: Network error: Client connection negotiation failed: client connection to 127.25.254.193:36105: connect: Connection refused (error 111)
I20260504 14:07:58.697793 31843 catalog_manager.cc:5002] TS f64e1371a0194da0ac2051ce739e5db5 (127.25.254.195:46527): tablet 38347f39e51d49f1bdd9dc6de37c95f0 (table test-table [id=f13f9d9618e9496d93ccff3a0dd09027]) successfully deleted
I20260504 14:07:58.701247 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32178
I20260504 14:07:58.707446 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 31825
2026-05-04T14:07:58Z chronyd exiting
[       OK ] SecurityITest.TestDisableAuthenticationEncryption (1072 ms)
[ RUN      ] SecurityITest.TestJwtMiniCluster
I20260504 14:07:58.729758 26619 mini_oidc.cc:150] Starting JWKS server
I20260504 14:07:58.731278 26619 webserver.cc:492] Webserver started at https://127.0.0.1:37143/ using document root <none> and password file <none>
I20260504 14:07:58.731467 26619 mini_oidc.cc:186] Starting OIDC Discovery server
I20260504 14:07:58.731738 26619 webserver.cc:492] Webserver started at http://127.0.0.1:42313/ using document root <none> and password file <none>
2026-05-04T14:07:58Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:07:58Z Disabled control of system clock
I20260504 14:07:58.740721 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42231
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:42013
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:42231
--enable_jwt_token_auth=true
--jwks_url=https://localhost:37143/jwks.json
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/testchainca.cert with env {}
W20260504 14:07:58.845024 32355 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:07:58.845335 32355 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:07:58.845412 32355 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:07:58.848906 32355 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:07:58.848982 32355 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:07:58.848999 32355 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:07:58.849018 32355 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:07:58.849036 32355 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
W20260504 14:07:58.849056 32355 flags.cc:432] Enabled experimental flag: --enable_jwt_token_auth=true
W20260504 14:07:58.849069 32355 flags.cc:432] Enabled experimental flag: --jwks_url=https://localhost:37143/jwks.json
I20260504 14:07:58.853569 32355 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:42013
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:42231
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42231
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--enable_jwt_token_auth=true
--jwks_url=https://localhost:37143/jwks.json
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/testchainca.cert
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32355
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:07:58.854694 32355 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:07:58.855609 32355 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:07:58.861552 32363 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.861836 32361 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:07:58.861864 32360 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:07:58.861814 32355 server_base.cc:1061] running on GCE node
I20260504 14:07:58.862550 32355 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:07:58.863481 32355 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:07:58.864646 32355 hybrid_clock.cc:648] HybridClock initialized: now 1777903678864618 us; error 42 us; skew 500 ppm
I20260504 14:07:58.866673 32355 webserver.cc:492] Webserver started at http://127.25.254.254:36457/ using document root <none> and password file <none>
I20260504 14:07:58.867268 32355 fs_manager.cc:362] Metadata directory not provided
I20260504 14:07:58.867324 32355 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:07:58.867533 32355 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:07:58.869251 32355 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "210c760ce9ff43e29cf10fd349c0ceb5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "a190e73675a0181a97eb72c928f01f51"
server_key_iv: "17a829c645476e40da58b0af15fc959d"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.869765 32355 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "210c760ce9ff43e29cf10fd349c0ceb5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "a190e73675a0181a97eb72c928f01f51"
server_key_iv: "17a829c645476e40da58b0af15fc959d"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.873236 32355 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:07:58.875603 32369 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:07:58.876716 32355 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:07:58.876863 32355 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "210c760ce9ff43e29cf10fd349c0ceb5"
format_stamp: "Formatted at 2026-05-04 14:07:58 on dist-test-slave-2x32"
server_key: "a190e73675a0181a97eb72c928f01f51"
server_key_iv: "17a829c645476e40da58b0af15fc959d"
server_key_version: "encryptionkey@0"
I20260504 14:07:58.876976 32355 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:07:58.905130 32355 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:07:58.905925 32355 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:07:58.906126 32355 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:07:58.914027 32355 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:42231
I20260504 14:07:58.914045 32421 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:42231 every 8 connection(s)
I20260504 14:07:58.915100 32355 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:07:58.916730 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32355
I20260504 14:07:58.916891 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:07:58.917158 26619 external_mini_cluster.cc:1468] Setting key 8bbacd1c5f8a3230bdc158e302da357b
I20260504 14:07:58.918291 32422 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:07:58.923846 32422 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5: Bootstrap starting.
I20260504 14:07:58.924661 32425 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.918343 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51468 (local address 127.25.254.254:42231)
0504 14:07:58.918933 (+   590us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.918942 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.918976 (+    34us) server_negotiation.cc:408] Connection header received
0504 14:07:58.919604 (+   628us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.919620 (+    16us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.919948 (+   328us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.920271 (+   323us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:07:58.920799 (+   528us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:58.921548 (+   749us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:58.922312 (+   764us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:58.922611 (+   299us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:58.923146 (+   535us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:07:58.923187 (+    41us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:07:58.923198 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:07:58.923572 (+   374us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:07:58.923611 (+    39us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:07:58.923624 (+    13us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:07:58.923737 (+   113us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:58.923912 (+   175us) server_negotiation.cc:300] Negotiation successful
0504 14:07:58.924060 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":275,"thread_start_us":136,"threads_started":1}
I20260504 14:07:58.926038 32422 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5: Neither blocks nor log segments found. Creating new log.
I20260504 14:07:58.926930 32422 log.cc:826] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5: Log is configured to *not* fsync() on all Append() calls
I20260504 14:07:58.927412 26619 external_mini_cluster.cc:949] 0 TS(s) registered with all masters
I20260504 14:07:58.928936 32422 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5: No bootstrap required, opened a new log
I20260504 14:07:58.931392 32422 raft_consensus.cc:359] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } }
I20260504 14:07:58.931597 32422 raft_consensus.cc:385] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:07:58.931655 32422 raft_consensus.cc:740] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 210c760ce9ff43e29cf10fd349c0ceb5, State: Initialized, Role: FOLLOWER
I20260504 14:07:58.932103 32422 consensus_queue.cc:260] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } }
I20260504 14:07:58.932245 32422 raft_consensus.cc:399] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:07:58.932312 32422 raft_consensus.cc:493] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:07:58.932401 32422 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:07:58.933212 32422 raft_consensus.cc:515] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } }
I20260504 14:07:58.933547 32422 leader_election.cc:304] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 210c760ce9ff43e29cf10fd349c0ceb5; no voters: 
I20260504 14:07:58.933848 32422 leader_election.cc:290] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:07:58.934223 32427 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:07:58.934455 32427 raft_consensus.cc:697] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [term 1 LEADER]: Becoming Leader. State: Replica: 210c760ce9ff43e29cf10fd349c0ceb5, State: Running, Role: LEADER
I20260504 14:07:58.934815 32427 consensus_queue.cc:237] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } }
I20260504 14:07:58.935061 32422 sys_catalog.cc:565] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:07:58.936470 32429 sys_catalog.cc:455] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 210c760ce9ff43e29cf10fd349c0ceb5. Latest consensus state: current_term: 1 leader_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } } }
I20260504 14:07:58.936618 32429 sys_catalog.cc:458] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:58.936952 32428 sys_catalog.cc:455] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "210c760ce9ff43e29cf10fd349c0ceb5" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 42231 } } }
I20260504 14:07:58.937047 32428 sys_catalog.cc:458] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5 [sys.catalog]: This master's current role is: LEADER
I20260504 14:07:58.937031 32436 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:07:58.941262 32436 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:07:58.947431 32436 catalog_manager.cc:1357] Generated new cluster ID: 554d2d1de00341aea8dbbbee36f7566e
I20260504 14:07:58.947525 32436 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:07:58.960184 32436 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:07:58.961412 32436 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:07:58.976590 32436 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 210c760ce9ff43e29cf10fd349c0ceb5: Generated new TSK 0
I20260504 14:07:58.977327 32436 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:07:59.016129 32425 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:58.992749 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51476 (local address 127.25.254.254:42231)
0504 14:07:58.992923 (+   174us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:58.992928 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:58.993122 (+   194us) server_negotiation.cc:408] Connection header received
0504 14:07:58.993347 (+   225us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:58.993350 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:58.993409 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:58.993505 (+    96us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:07:58.993971 (+   466us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:58.994661 (+   690us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:58.995632 (+   971us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:58.995827 (+   195us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:58.995912 (+    85us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:07:59.015148 (+ 19236us) server_negotiation.cc:378] Sending JWT_EXCHANGE NegotiatePB response
0504 14:07:59.015251 (+   103us) server_negotiation.cc:1036] Waiting for connection context
0504 14:07:59.015723 (+   472us) server_negotiation.cc:300] Negotiation successful
0504 14:07:59.015818 (+    95us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":65,"thread_start_us":164,"threads_started":1}
I20260504 14:07:59.028292 32425 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:07:59.024533 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51480 (local address 127.25.254.254:42231)
0504 14:07:59.024762 (+   229us) server_negotiation.cc:207] Beginning negotiation
0504 14:07:59.024765 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:07:59.024789 (+    24us) server_negotiation.cc:408] Connection header received
0504 14:07:59.024945 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:07:59.024948 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:07:59.024991 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:07:59.025060 (+    69us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:07:59.025525 (+   465us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:59.026068 (+   543us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:07:59.027006 (+   938us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:07:59.027151 (+   145us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:59.027192 (+    41us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:07:59.028011 (+   819us) server_negotiation.cc:395] Sending RPC error: FATAL_INVALID_JWT: Not authorized: JWT verification failed: failed to verify signature: VerifyFinal failed (error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding:../crypto/rsa/rsa_pk1.c:67 error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed:../crypto/rsa/rsa_ossl.c:581)
0504 14:07:59.028194 (+   183us) negotiation.cc:326] Negotiation complete: Not authorized: Server connection negotiation failed: server connection from 127.0.0.1:51480: JWT verification failed: failed to verify signature: VerifyFinal failed (error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding:../crypto/rsa/rsa_pk1.c:67 error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed:../crypto/rsa/rsa_ossl.c:581)
Metrics: {"server-negotiator.queue_time_us":152}
W20260504 14:07:59.028386 32425 negotiation.cc:343] Unauthorized connection attempt: Server connection negotiation failed: server connection from 127.0.0.1:51480: JWT verification failed: failed to verify signature: VerifyFinal failed (error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding:../crypto/rsa/rsa_pk1.c:67 error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed:../crypto/rsa/rsa_ossl.c:581)
W20260504 14:07:59.029207 32458 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:07:59.024302 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42231 (local address 127.0.0.1:51480)
0504 14:07:59.024716 (+   414us) negotiation.cc:107] Waiting for socket to connect
0504 14:07:59.024732 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:07:59.024841 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:07:59.025095 (+   254us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:07:59.025099 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:07:59.025108 (+     9us) client_negotiation.cc:190] Negotiated authn=JWT
0504 14:07:59.025372 (+   264us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:59.025383 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:59.026215 (+   832us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:07:59.026220 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:07:59.026879 (+   659us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:07:59.026891 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:07:59.027009 (+   118us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:07:59.027040 (+    31us) client_negotiation.cc:253] Sending JWT_EXCHANGE NegotiatePB request
0504 14:07:59.028196 (+  1156us) client_negotiation.cc:284] Received error response from server: Runtime error: FATAL_INVALID_JWT: Not authorized: JWT verification failed: failed to verify signature: VerifyFinal failed (error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding:../crypto/rsa/rsa_pk1.c:67 error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed:../crypto/rsa/rsa_ossl.c:581)
0504 14:07:59.028359 (+   163us) negotiation.cc:326] Negotiation complete: Runtime error: Client connection negotiation failed: client connection to 127.25.254.254:42231: FATAL_INVALID_JWT: Not authorized: JWT verification failed: failed to verify signature: VerifyFinal failed (error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding:../crypto/rsa/rsa_pk1.c:67 error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed:../crypto/rsa/rsa_ossl.c:581)
Metrics: {"client-negotiator.queue_time_us":320,"thread_start_us":163,"threads_started":1}
I20260504 14:08:14.038478 32464 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.034723 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37618 (local address 127.25.254.254:42231)
0504 14:08:14.034996 (+   273us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.035000 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.035015 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:08:14.035165 (+   150us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.035168 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.035215 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.035303 (+    88us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:08:14.035821 (+   518us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.036457 (+   636us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.037282 (+   825us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.037491 (+   209us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.037537 (+    46us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:08:14.038224 (+   687us) server_negotiation.cc:395] Sending RPC error: FATAL_INVALID_JWT: Not authorized: JWT verification failed: token expired
0504 14:08:14.038366 (+   142us) negotiation.cc:326] Negotiation complete: Not authorized: Server connection negotiation failed: server connection from 127.0.0.1:37618: JWT verification failed: token expired
Metrics: {"server-negotiator.queue_time_us":205,"thread_start_us":109,"threads_started":1}
W20260504 14:08:14.038543 32464 negotiation.cc:343] Unauthorized connection attempt: Server connection negotiation failed: server connection from 127.0.0.1:37618: JWT verification failed: token expired
W20260504 14:08:14.038746 32463 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:14.034567 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:42231 (local address 127.0.0.1:37618)
0504 14:08:14.034936 (+   369us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:14.034955 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:14.035063 (+   108us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:14.035365 (+   302us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:14.035368 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:14.035375 (+     7us) client_negotiation.cc:190] Negotiated authn=JWT
0504 14:08:14.035658 (+   283us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.035670 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.036651 (+   981us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.036654 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:14.037167 (+   513us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.037177 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.037277 (+   100us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.037297 (+    20us) client_negotiation.cc:253] Sending JWT_EXCHANGE NegotiatePB request
0504 14:08:14.038361 (+  1064us) client_negotiation.cc:284] Received error response from server: Runtime error: FATAL_INVALID_JWT: Not authorized: JWT verification failed: token expired
0504 14:08:14.038508 (+   147us) negotiation.cc:326] Negotiation complete: Runtime error: Client connection negotiation failed: client connection to 127.25.254.254:42231: FATAL_INVALID_JWT: Not authorized: JWT verification failed: token expired
Metrics: {"client-negotiator.queue_time_us":202,"thread_start_us":130,"threads_started":1}
W20260504 14:08:14.041487 32464 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:14.040617 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37628 (local address 127.25.254.254:42231)
0504 14:08:14.040774 (+   157us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.040779 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.040889 (+   110us) server_negotiation.cc:408] Connection header received
0504 14:08:14.041038 (+   149us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.041042 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.041097 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.041186 (+    89us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:14.041378 (+   192us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:37628: BlockingRecv error: recv got EOF from 127.0.0.1:37628 (error 108)
Metrics: {"server-negotiator.queue_time_us":67}
W20260504 14:08:14.045156 32474 client_negotiation.cc:376] the client has a JWT but it isn't advertising its JWT-based authn capability since it doesn't trust the certificate of the server at 127.25.254.254:42231
W20260504 14:08:14.045769 32464 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:14.044853 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37634 (local address 127.25.254.254:42231)
0504 14:08:14.044983 (+   130us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.044986 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.045117 (+   131us) server_negotiation.cc:408] Connection header received
0504 14:08:14.045331 (+   214us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.045335 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.045391 (+    56us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.045479 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:14.045662 (+   183us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:37634: BlockingRecv error: recv got EOF from 127.0.0.1:37634 (error 108)
Metrics: {"server-negotiator.queue_time_us":53}
I20260504 14:08:14.046056 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32355
2026-05-04T14:08:14Z chronyd exiting
[       OK ] SecurityITest.TestJwtMiniCluster (15357 ms)
[ RUN      ] SecurityITest.TestJwtMiniClusterWithInvalidCert
I20260504 14:08:14.086911 26619 mini_oidc.cc:150] Starting JWKS server
I20260504 14:08:14.087862 26619 webserver.cc:492] Webserver started at https://127.0.0.1:38797/ using document root <none> and password file <none>
I20260504 14:08:14.088004 26619 mini_oidc.cc:186] Starting OIDC Discovery server
I20260504 14:08:14.088245 26619 webserver.cc:492] Webserver started at http://127.0.0.1:38191/ using document root <none> and password file <none>
2026-05-04T14:08:14Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:14Z Disabled control of system clock
I20260504 14:08:14.096805 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44211
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:46639
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:44211
--enable_jwt_token_auth=true
--jwks_url=https://localhost:38797/jwks.json
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/testchainca.cert with env {}
W20260504 14:08:14.203987 32480 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:14.204234 32480 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:14.204286 32480 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:14.207758 32480 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:14.207829 32480 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:14.207846 32480 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:14.207864 32480 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:14.207882 32480 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
W20260504 14:08:14.207901 32480 flags.cc:432] Enabled experimental flag: --enable_jwt_token_auth=true
W20260504 14:08:14.207916 32480 flags.cc:432] Enabled experimental flag: --jwks_url=https://localhost:38797/jwks.json
I20260504 14:08:14.212574 32480 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46639
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:44211
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44211
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--enable_jwt_token_auth=true
--jwks_url=https://localhost:38797/jwks.json
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/testchainca.cert
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32480
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:14.213660 32480 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:14.214586 32480 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260504 14:08:14.220340 32480 server_base.cc:1061] running on GCE node
W20260504 14:08:14.220273 32486 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:14.220273 32488 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:14.220279 32485 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:14.221138 32480 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:14.222105 32480 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:14.223315 32480 hybrid_clock.cc:648] HybridClock initialized: now 1777903694223289 us; error 41 us; skew 500 ppm
I20260504 14:08:14.225214 32480 webserver.cc:492] Webserver started at http://127.25.254.254:39325/ using document root <none> and password file <none>
I20260504 14:08:14.225793 32480 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:14.225888 32480 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:14.226131 32480 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:14.227840 32480 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "5eb7c1c37c8e42318a46e9495410a6df"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "e4ab20cd96adae56475fa154af287fbc"
server_key_iv: "829d7541094c8569313784e547839b21"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.228350 32480 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "5eb7c1c37c8e42318a46e9495410a6df"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "e4ab20cd96adae56475fa154af287fbc"
server_key_iv: "829d7541094c8569313784e547839b21"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.232005 32480 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:14.234484 32494 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:14.235785 32480 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:14.235977 32480 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "5eb7c1c37c8e42318a46e9495410a6df"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "e4ab20cd96adae56475fa154af287fbc"
server_key_iv: "829d7541094c8569313784e547839b21"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.236100 32480 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:14.258363 32480 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:14.259048 32480 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:14.259241 32480 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:14.267700 32480 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:44211
I20260504 14:08:14.267771 32546 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:44211 every 8 connection(s)
I20260504 14:08:14.268829 32480 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:14.272522 32547 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:14.272840 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32480
I20260504 14:08:14.272974 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithInvalidCert.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:14.273226 26619 external_mini_cluster.cc:1468] Setting key ce810ae7bc87847c6d758b7e85025596
I20260504 14:08:14.279896 32547 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df: Bootstrap starting.
I20260504 14:08:14.281663 32550 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.274490 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:45388 (local address 127.25.254.254:44211)
0504 14:08:14.275114 (+   624us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.275123 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.275158 (+    35us) server_negotiation.cc:408] Connection header received
0504 14:08:14.276078 (+   920us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.276103 (+    25us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.276449 (+   346us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.276814 (+   365us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:14.277350 (+   536us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.278128 (+   778us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.278949 (+   821us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.279250 (+   301us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.280042 (+   792us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:14.280099 (+    57us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:14.280122 (+    23us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:14.280516 (+   394us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:14.280552 (+    36us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:14.280566 (+    14us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:14.280703 (+   137us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:14.280914 (+   211us) server_negotiation.cc:300] Negotiation successful
0504 14:08:14.281074 (+   160us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":328,"thread_start_us":123,"threads_started":1}
I20260504 14:08:14.282362 32547 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:14.283147 32547 log.cc:826] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:14.284627 26619 external_mini_cluster.cc:949] 0 TS(s) registered with all masters
I20260504 14:08:14.285297 32547 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df: No bootstrap required, opened a new log
I20260504 14:08:14.288158 32547 raft_consensus.cc:359] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } }
I20260504 14:08:14.288398 32547 raft_consensus.cc:385] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:14.288484 32547 raft_consensus.cc:740] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5eb7c1c37c8e42318a46e9495410a6df, State: Initialized, Role: FOLLOWER
I20260504 14:08:14.288955 32547 consensus_queue.cc:260] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } }
I20260504 14:08:14.289076 32547 raft_consensus.cc:399] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:14.289116 32547 raft_consensus.cc:493] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:14.289177 32547 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:14.290067 32547 raft_consensus.cc:515] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } }
I20260504 14:08:14.290457 32547 leader_election.cc:304] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 5eb7c1c37c8e42318a46e9495410a6df; no voters: 
I20260504 14:08:14.290753 32547 leader_election.cc:290] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:14.290983 32552 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:14.291317 32552 raft_consensus.cc:697] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [term 1 LEADER]: Becoming Leader. State: Replica: 5eb7c1c37c8e42318a46e9495410a6df, State: Running, Role: LEADER
I20260504 14:08:14.291775 32552 consensus_queue.cc:237] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } }
I20260504 14:08:14.292045 32547 sys_catalog.cc:565] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:14.293776 32553 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "5eb7c1c37c8e42318a46e9495410a6df" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } } }
I20260504 14:08:14.293855 32554 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [sys.catalog]: SysCatalogTable state changed. Reason: New leader 5eb7c1c37c8e42318a46e9495410a6df. Latest consensus state: current_term: 1 leader_uuid: "5eb7c1c37c8e42318a46e9495410a6df" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5eb7c1c37c8e42318a46e9495410a6df" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44211 } } }
I20260504 14:08:14.293975 32554 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:14.293919 32553 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:14.294569 32564 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:14.297894 32564 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:14.304127 32564 catalog_manager.cc:1357] Generated new cluster ID: 5d85b9920d374363926707992305a2d6
I20260504 14:08:14.304220 32564 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:14.356827 32564 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:14.358238 32564 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:14.367857 32564 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 5eb7c1c37c8e42318a46e9495410a6df: Generated new TSK 0
I20260504 14:08:14.368644 32564 catalog_manager.cc:1524] Initializing in-progress tserver states...
E20260504 14:08:14.407160 32577 webserver.cc:578] Webserver: sslize failed: error:14094418:SSL routines:ssl3_read_bytes:tlsv1 alert unknown ca
E20260504 14:08:14.407258 32577 webserver.cc:578] Webserver: error shutting down SSL connection: error:140E0197:SSL routines:SSL_shutdown:shutdown while in init
I20260504 14:08:14.407872 32550 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.383577 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:45392 (local address 127.25.254.254:44211)
0504 14:08:14.383738 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.383742 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.383850 (+   108us) server_negotiation.cc:408] Connection header received
0504 14:08:14.384034 (+   184us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.384037 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.384098 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.384185 (+    87us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:08:14.384663 (+   478us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.385262 (+   599us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.386324 (+  1062us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.386494 (+   170us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.386605 (+   111us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:08:14.407495 (+ 20890us) server_negotiation.cc:395] Sending RPC error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:38797/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
0504 14:08:14.407746 (+   251us) negotiation.cc:326] Negotiation complete: Not authorized: Server connection negotiation failed: server connection from 127.0.0.1:45392: Failed to load JWKS: Error downloading JWKS from 'https://localhost:38797/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
Metrics: {"server-negotiator.queue_time_us":74}
W20260504 14:08:14.407976 32550 negotiation.cc:343] Unauthorized connection attempt: Server connection negotiation failed: server connection from 127.0.0.1:45392: Failed to load JWKS: Error downloading JWKS from 'https://localhost:38797/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
W20260504 14:08:14.408077 32576 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:14.383389 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44211 (local address 127.0.0.1:45392)
0504 14:08:14.383742 (+   353us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:14.383759 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:14.383898 (+   139us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:14.384218 (+   320us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:14.384222 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:14.384231 (+     9us) client_negotiation.cc:190] Negotiated authn=JWT
0504 14:08:14.384518 (+   287us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.384528 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.385394 (+   866us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.385398 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:14.386083 (+   685us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.386095 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.386266 (+   171us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.386297 (+    31us) client_negotiation.cc:253] Sending JWT_EXCHANGE NegotiatePB request
0504 14:08:14.407723 (+ 21426us) client_negotiation.cc:284] Received error response from server: Runtime error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:38797/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
0504 14:08:14.407902 (+   179us) negotiation.cc:326] Negotiation complete: Runtime error: Client connection negotiation failed: client connection to 127.25.254.254:44211: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:38797/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
Metrics: {"client-negotiator.queue_time_us":257,"thread_start_us":161,"threads_started":1}
I20260504 14:08:14.408782 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32480
2026-05-04T14:08:14Z chronyd exiting
[       OK ] SecurityITest.TestJwtMiniClusterWithInvalidCert (362 ms)
[ RUN      ] SecurityITest.TestJwtMiniClusterWithUntrustedCert
I20260504 14:08:14.449131 26619 mini_oidc.cc:150] Starting JWKS server
I20260504 14:08:14.450078 26619 webserver.cc:492] Webserver started at https://127.0.0.1:33845/ using document root <none> and password file <none>
I20260504 14:08:14.450181 26619 mini_oidc.cc:186] Starting OIDC Discovery server
I20260504 14:08:14.450429 26619 webserver.cc:492] Webserver started at http://127.0.0.1:34571/ using document root <none> and password file <none>
2026-05-04T14:08:14Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:14Z Disabled control of system clock
I20260504 14:08:14.458936 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44253
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:38603
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:44253
--enable_jwt_token_auth=true
--jwks_url=https://localhost:33845/jwks.json
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {}
W20260504 14:08:14.566310 32583 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:14.566570 32583 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:14.566623 32583 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:14.570039 32583 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:14.570108 32583 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:14.570125 32583 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:14.570144 32583 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:14.570233 32583 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
W20260504 14:08:14.570267 32583 flags.cc:432] Enabled experimental flag: --enable_jwt_token_auth=true
W20260504 14:08:14.570302 32583 flags.cc:432] Enabled experimental flag: --jwks_url=https://localhost:33845/jwks.json
I20260504 14:08:14.574406 32583 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:38603
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:44253
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44253
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--enable_jwt_token_auth=true
--jwks_url=https://localhost:33845/jwks.json
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32583
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:14.575515 32583 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:14.576660 32583 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:14.583177 32591 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:14.583271 32589 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:14.583451 32583 server_base.cc:1061] running on GCE node
W20260504 14:08:14.583180 32588 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:14.584144 32583 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:14.585080 32583 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:14.586253 32583 hybrid_clock.cc:648] HybridClock initialized: now 1777903694586228 us; error 38 us; skew 500 ppm
I20260504 14:08:14.588316 32583 webserver.cc:492] Webserver started at http://127.25.254.254:43315/ using document root <none> and password file <none>
I20260504 14:08:14.588884 32583 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:14.588938 32583 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:14.589104 32583 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:14.590884 32583 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "c75bb12cf8c84c5b95fb8aef248c64b8"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "0c522e11639d9d9a5d5b7a72c4f4ff60"
server_key_iv: "ba35d564e805ae13734bf7aafdbad4ad"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.591347 32583 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "c75bb12cf8c84c5b95fb8aef248c64b8"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "0c522e11639d9d9a5d5b7a72c4f4ff60"
server_key_iv: "ba35d564e805ae13734bf7aafdbad4ad"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.595201 32583 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.003s	sys 0.000s
I20260504 14:08:14.597687 32597 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:14.598866 32583 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:14.598985 32583 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "c75bb12cf8c84c5b95fb8aef248c64b8"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "0c522e11639d9d9a5d5b7a72c4f4ff60"
server_key_iv: "ba35d564e805ae13734bf7aafdbad4ad"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.599077 32583 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:14.622089 32583 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:14.622762 32583 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:14.622928 32583 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:14.630108 32583 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:44253
I20260504 14:08:14.630119 32649 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:44253 every 8 connection(s)
I20260504 14:08:14.631191 32583 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:14.634207 32650 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:14.634640 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32583
I20260504 14:08:14.634763 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithUntrustedCert.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:14.635003 26619 external_mini_cluster.cc:1468] Setting key 2678043b49b7b7b077715058eeded54a
I20260504 14:08:14.640589 32650 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8: Bootstrap starting.
I20260504 14:08:14.643172 32650 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:14.643214 32653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.636371 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:60126 (local address 127.25.254.254:44253)
0504 14:08:14.636986 (+   615us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.636995 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.637028 (+    33us) server_negotiation.cc:408] Connection header received
0504 14:08:14.637558 (+   530us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.637575 (+    17us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.637933 (+   358us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.638270 (+   337us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:14.638840 (+   570us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.639639 (+   799us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.640538 (+   899us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.640815 (+   277us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.641569 (+   754us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:14.641622 (+    53us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:14.641634 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:14.642018 (+   384us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:14.642054 (+    36us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:14.642068 (+    14us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:14.642148 (+    80us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:14.642402 (+   254us) server_negotiation.cc:300] Negotiation successful
0504 14:08:14.642610 (+   208us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":301,"thread_start_us":131,"threads_started":1}
I20260504 14:08:14.643927 32650 log.cc:826] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:14.646014 32650 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8: No bootstrap required, opened a new log
I20260504 14:08:14.646256 26619 external_mini_cluster.cc:949] 0 TS(s) registered with all masters
I20260504 14:08:14.648861 32650 raft_consensus.cc:359] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } }
I20260504 14:08:14.649078 32650 raft_consensus.cc:385] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:14.649173 32650 raft_consensus.cc:740] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c75bb12cf8c84c5b95fb8aef248c64b8, State: Initialized, Role: FOLLOWER
I20260504 14:08:14.649567 32650 consensus_queue.cc:260] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } }
I20260504 14:08:14.649720 32650 raft_consensus.cc:399] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:14.649802 32650 raft_consensus.cc:493] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:14.649935 32650 raft_consensus.cc:3060] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:14.650885 32650 raft_consensus.cc:515] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } }
I20260504 14:08:14.651261 32650 leader_election.cc:304] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c75bb12cf8c84c5b95fb8aef248c64b8; no voters: 
I20260504 14:08:14.651566 32650 leader_election.cc:290] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:14.651753 32655 raft_consensus.cc:2804] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:14.652019 32655 raft_consensus.cc:697] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [term 1 LEADER]: Becoming Leader. State: Replica: c75bb12cf8c84c5b95fb8aef248c64b8, State: Running, Role: LEADER
I20260504 14:08:14.652303 32655 consensus_queue.cc:237] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } }
I20260504 14:08:14.652711 32650 sys_catalog.cc:565] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:14.654232 32657 sys_catalog.cc:455] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [sys.catalog]: SysCatalogTable state changed. Reason: New leader c75bb12cf8c84c5b95fb8aef248c64b8. Latest consensus state: current_term: 1 leader_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } } }
I20260504 14:08:14.654373 32657 sys_catalog.cc:458] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:14.654796 32656 sys_catalog.cc:455] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c75bb12cf8c84c5b95fb8aef248c64b8" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44253 } } }
I20260504 14:08:14.654872 32656 sys_catalog.cc:458] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:14.655071 32664 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:14.657954 32664 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:14.663115 32664 catalog_manager.cc:1357] Generated new cluster ID: 47cedf7ee319492faaf8ff0d0c214641
I20260504 14:08:14.663194 32664 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:14.675063 32664 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:14.675999 32664 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:14.688213 32664 catalog_manager.cc:6044] T 00000000000000000000000000000000 P c75bb12cf8c84c5b95fb8aef248c64b8: Generated new TSK 0
I20260504 14:08:14.689006 32664 catalog_manager.cc:1524] Initializing in-progress tserver states...
E20260504 14:08:14.722766 32680 webserver.cc:578] Webserver: sslize failed: error:14094418:SSL routines:ssl3_read_bytes:tlsv1 alert unknown ca
E20260504 14:08:14.722874 32680 webserver.cc:578] Webserver: error shutting down SSL connection: error:140E0197:SSL routines:SSL_shutdown:shutdown while in init
I20260504 14:08:14.724486 32653 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.705847 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:60142 (local address 127.25.254.254:44253)
0504 14:08:14.705994 (+   147us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.705998 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.706145 (+   147us) server_negotiation.cc:408] Connection header received
0504 14:08:14.706364 (+   219us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.706368 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.706433 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.706553 (+   120us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:08:14.706995 (+   442us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.707839 (+   844us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.708564 (+   725us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.708769 (+   205us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.708883 (+   114us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:08:14.724003 (+ 15120us) server_negotiation.cc:395] Sending RPC error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:33845/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
0504 14:08:14.724334 (+   331us) negotiation.cc:326] Negotiation complete: Not authorized: Server connection negotiation failed: server connection from 127.0.0.1:60142: Failed to load JWKS: Error downloading JWKS from 'https://localhost:33845/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
Metrics: {"server-negotiator.queue_time_us":61}
W20260504 14:08:14.724639 32653 negotiation.cc:343] Unauthorized connection attempt: Server connection negotiation failed: server connection from 127.0.0.1:60142: Failed to load JWKS: Error downloading JWKS from 'https://localhost:33845/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
W20260504 14:08:14.724655 32679 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:14.705750 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44253 (local address 127.0.0.1:60142)
0504 14:08:14.706056 (+   306us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:14.706069 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:14.706184 (+   115us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:14.706578 (+   394us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:14.706581 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:14.706588 (+     7us) client_negotiation.cc:190] Negotiated authn=JWT
0504 14:08:14.706831 (+   243us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.706839 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.707999 (+  1160us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.708003 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:14.708444 (+   441us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:14.708453 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.708559 (+   106us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.708582 (+    23us) client_negotiation.cc:253] Sending JWT_EXCHANGE NegotiatePB request
0504 14:08:14.724246 (+ 15664us) client_negotiation.cc:284] Received error response from server: Runtime error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:33845/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
0504 14:08:14.724444 (+   198us) negotiation.cc:326] Negotiation complete: Runtime error: Client connection negotiation failed: client connection to 127.25.254.254:44253: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'https://localhost:33845/jwks.json': curl error: SSL peer certificate or SSH remote key was not OK: SSL certificate problem: unable to get local issuer certificate
Metrics: {"client-negotiator.queue_time_us":206,"thread_start_us":99,"threads_started":1}
I20260504 14:08:14.725315 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32583
2026-05-04T14:08:14Z chronyd exiting
[       OK ] SecurityITest.TestJwtMiniClusterWithUntrustedCert (317 ms)
[ RUN      ] SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS
2026-05-04T14:08:14Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:14Z Disabled control of system clock
I20260504 14:08:14.774701 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44159
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:39951
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:44159
--enable_jwt_token_auth=true
--jwks_url=default.url
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/testchainca.cert with env {}
W20260504 14:08:14.883457 32684 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:14.883706 32684 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:14.883756 32684 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:14.887219 32684 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:14.887296 32684 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:14.887313 32684 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:14.887332 32684 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:14.887349 32684 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
W20260504 14:08:14.887368 32684 flags.cc:432] Enabled experimental flag: --enable_jwt_token_auth=true
W20260504 14:08:14.887382 32684 flags.cc:432] Enabled experimental flag: --jwks_url=default.url
I20260504 14:08:14.892429 32684 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39951
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:44159
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44159
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--enable_jwt_token_auth=true
--jwks_url=default.url
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--trusted_certificate_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/testchainca.cert
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.32684
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:14.893599 32684 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:14.894443 32684 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:14.900646 32692 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:14.900672 32689 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:14.900696 32690 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:14.900799 32684 server_base.cc:1061] running on GCE node
I20260504 14:08:14.901722 32684 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:14.902726 32684 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:14.903935 32684 hybrid_clock.cc:648] HybridClock initialized: now 1777903694903911 us; error 49 us; skew 500 ppm
I20260504 14:08:14.906035 32684 webserver.cc:492] Webserver started at http://127.25.254.254:35963/ using document root <none> and password file <none>
I20260504 14:08:14.906689 32684 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:14.906778 32684 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:14.907011 32684 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:14.908722 32684 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "9be1dbf1a17241ebb851619ef80aa7d2"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "38c5a98e7b6152c9122df6e1fec8bcd0"
server_key_iv: "c70d30db4db577a64959f830da76d2c1"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.909204 32684 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "9be1dbf1a17241ebb851619ef80aa7d2"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "38c5a98e7b6152c9122df6e1fec8bcd0"
server_key_iv: "c70d30db4db577a64959f830da76d2c1"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.912484 32684 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:14.914988 32698 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:14.916002 32684 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:14.916148 32684 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "9be1dbf1a17241ebb851619ef80aa7d2"
format_stamp: "Formatted at 2026-05-04 14:08:14 on dist-test-slave-2x32"
server_key: "38c5a98e7b6152c9122df6e1fec8bcd0"
server_key_iv: "c70d30db4db577a64959f830da76d2c1"
server_key_version: "encryptionkey@0"
I20260504 14:08:14.916260 32684 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:14.940461 32684 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:14.941424 32684 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:14.941651 32684 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:14.950294 32684 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:44159
I20260504 14:08:14.950314 32750 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:44159 every 8 connection(s)
I20260504 14:08:14.951431 32684 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:14.954497 32751 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:14.960047 32751 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: Bootstrap starting.
I20260504 14:08:14.961395 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 32684
I20260504 14:08:14.961632 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:14.961887 26619 external_mini_cluster.cc:1468] Setting key 12ef83a4514b78e33807dccbd4e296fa
I20260504 14:08:14.962852 32751 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:14.963677 32751 log.cc:826] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:14.965821 32751 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: No bootstrap required, opened a new log
I20260504 14:08:14.969101 32751 raft_consensus.cc:359] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } }
I20260504 14:08:14.969367 32751 raft_consensus.cc:385] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:14.969478 32751 raft_consensus.cc:740] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9be1dbf1a17241ebb851619ef80aa7d2, State: Initialized, Role: FOLLOWER
I20260504 14:08:14.969944 32751 consensus_queue.cc:260] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } }
I20260504 14:08:14.970189 32751 raft_consensus.cc:399] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:14.970306 32751 raft_consensus.cc:493] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:14.970425 32751 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:14.970451 32754 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:14.962984 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51138 (local address 127.25.254.254:44159)
0504 14:08:14.963593 (+   609us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:14.963662 (+    69us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:14.963696 (+    34us) server_negotiation.cc:408] Connection header received
0504 14:08:14.964421 (+   725us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:14.964438 (+    17us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:14.964788 (+   350us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:14.965078 (+   290us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:14.965682 (+   604us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.966589 (+   907us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:14.967548 (+   959us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:14.967884 (+   336us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:14.968467 (+   583us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:14.968489 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:14.968502 (+    13us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:14.969073 (+   571us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:14.969112 (+    39us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:14.969128 (+    16us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:14.969221 (+    93us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:14.969465 (+   244us) server_negotiation.cc:300] Negotiation successful
0504 14:08:14.969682 (+   217us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":304,"thread_start_us":159,"threads_started":1}
I20260504 14:08:14.971412 32751 raft_consensus.cc:515] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } }
I20260504 14:08:14.971764 32751 leader_election.cc:304] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 9be1dbf1a17241ebb851619ef80aa7d2; no voters: 
I20260504 14:08:14.972154 32751 leader_election.cc:290] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:14.972235 32756 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:14.972419 32756 raft_consensus.cc:697] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [term 1 LEADER]: Becoming Leader. State: Replica: 9be1dbf1a17241ebb851619ef80aa7d2, State: Running, Role: LEADER
I20260504 14:08:14.972790 32756 consensus_queue.cc:237] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } }
I20260504 14:08:14.973150 26619 external_mini_cluster.cc:949] 0 TS(s) registered with all masters
I20260504 14:08:14.973376 32751 sys_catalog.cc:565] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:14.973965 32758 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } } }
I20260504 14:08:14.974084 32758 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:14.974615 32757 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 9be1dbf1a17241ebb851619ef80aa7d2. Latest consensus state: current_term: 1 leader_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9be1dbf1a17241ebb851619ef80aa7d2" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44159 } } }
I20260504 14:08:14.974690 32757 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2 [sys.catalog]: This master's current role is: LEADER
W20260504 14:08:14.979161   304 catalog_manager.cc:1568] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:08:14.979266   304 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:08:14.979352 32760 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:14.980453 32760 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:14.986580 32760 catalog_manager.cc:1357] Generated new cluster ID: b9432f5b64954009ba092e2d4b7bce44
I20260504 14:08:14.986676 32760 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:14.997284 32760 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:14.998524 32760 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:15.014245 32760 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 9be1dbf1a17241ebb851619ef80aa7d2: Generated new TSK 0
I20260504 14:08:15.015062 32760 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:08:15.118505 32754 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:15.029821 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51150 (local address 127.25.254.254:44159)
0504 14:08:15.029987 (+   166us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:15.029991 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:15.030193 (+   202us) server_negotiation.cc:408] Connection header received
0504 14:08:15.030391 (+   198us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:15.030394 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:15.030446 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:15.030549 (+   103us) server_negotiation.cc:227] Negotiated authn=JWT
0504 14:08:15.030963 (+   414us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:15.031877 (+   914us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:15.032552 (+   675us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:15.032758 (+   206us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:15.032836 (+    78us) server_negotiation.cc:366] Received JWT_EXCHANGE NegotiatePB request
0504 14:08:15.117950 (+ 85114us) server_negotiation.cc:395] Sending RPC error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'default.url': curl error: Could not resolve hostname: Could not resolve host: default.url
0504 14:08:15.118343 (+   393us) negotiation.cc:326] Negotiation complete: Not authorized: Server connection negotiation failed: server connection from 127.0.0.1:51150: Failed to load JWKS: Error downloading JWKS from 'default.url': curl error: Could not resolve hostname: Could not resolve host: default.url
Metrics: {"server-negotiator.queue_time_us":73}
W20260504 14:08:15.118597 32754 negotiation.cc:343] Unauthorized connection attempt: Server connection negotiation failed: server connection from 127.0.0.1:51150: Failed to load JWKS: Error downloading JWKS from 'default.url': curl error: Could not resolve hostname: Could not resolve host: default.url
W20260504 14:08:15.118610   312 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:15.029709 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44159 (local address 127.0.0.1:51150)
0504 14:08:15.030080 (+   371us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:15.030093 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:15.030233 (+   140us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:15.030555 (+   322us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:15.030558 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:15.030567 (+     9us) client_negotiation.cc:190] Negotiated authn=JWT
0504 14:08:15.030841 (+   274us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:15.030850 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:15.032001 (+  1151us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:15.032006 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:15.032433 (+   427us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:15.032442 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:15.032572 (+   130us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:15.032597 (+    25us) client_negotiation.cc:253] Sending JWT_EXCHANGE NegotiatePB request
0504 14:08:15.118230 (+ 85633us) client_negotiation.cc:284] Received error response from server: Runtime error: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'default.url': curl error: Could not resolve hostname: Could not resolve host: default.url
0504 14:08:15.118436 (+   206us) negotiation.cc:326] Negotiation complete: Runtime error: Client connection negotiation failed: client connection to 127.25.254.254:44159: FATAL_INVALID_JWT: Not authorized: Failed to load JWKS: Error downloading JWKS from 'default.url': curl error: Could not resolve hostname: Could not resolve host: default.url
Metrics: {"client-negotiator.queue_time_us":267,"thread_start_us":112,"threads_started":1}
I20260504 14:08:15.119347 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 32684
2026-05-04T14:08:15Z chronyd exiting
[       OK ] SecurityITest.TestJwtMiniClusterWithNonWorkingJWKS (373 ms)
[ RUN      ] SecurityITest.TestWorldReadableKeytab
[       OK ] SecurityITest.TestWorldReadableKeytab (121 ms)
[ RUN      ] SecurityITest.TestWorldReadablePrivateKey
[       OK ] SecurityITest.TestWorldReadablePrivateKey (122 ms)
[ RUN      ] SecurityITest.TestCorruptKerberosCC
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:15 dist-test-slave-2x32 krb5kdc[324](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:15 dist-test-slave-2x32 krb5kdc[324](info): set up 2 sockets
May 04 14:08:15 dist-test-slave-2x32 krb5kdc[324](info): commencing operation
krb5kdc: starting...
W20260504 14:08:17.434859 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.050s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:17 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903697, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:17Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:17Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:17.595594 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46055
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:45163
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:46055
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:17.706103   340 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:17.706441   340 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:17.706502   340 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:17.710291   340 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:17.710371   340 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:17.710398   340 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:17.710419   340 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:17.710439   340 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:17.715116   340 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:45163
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:46055
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:46055
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.340
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:17.716183   340 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:17.716988   340 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:17.722802   346 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:17.722774   348 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:17.722898   345 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:17.723059   340 server_base.cc:1061] running on GCE node
I20260504 14:08:17.723933   340 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:17.724951   340 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:17.726137   340 hybrid_clock.cc:648] HybridClock initialized: now 1777903697726119 us; error 33 us; skew 500 ppm
May 04 14:08:17 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903697, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:17.729301   340 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:08:17.730460   340 webserver.cc:492] Webserver started at http://127.25.254.254:36133/ using document root <none> and password file <none>
I20260504 14:08:17.731005   340 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:17.731052   340 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:17.731272   340 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:17.733014   340 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "4bdc1722999746fbbfa6b0f7027cd719"
format_stamp: "Formatted at 2026-05-04 14:08:17 on dist-test-slave-2x32"
server_key: "521d2b7027ac89685ca99bb44aa83bca"
server_key_iv: "70a98ced8d3448aed0badf93a35a1f51"
server_key_version: "encryptionkey@0"
I20260504 14:08:17.733512   340 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "4bdc1722999746fbbfa6b0f7027cd719"
format_stamp: "Formatted at 2026-05-04 14:08:17 on dist-test-slave-2x32"
server_key: "521d2b7027ac89685ca99bb44aa83bca"
server_key_iv: "70a98ced8d3448aed0badf93a35a1f51"
server_key_version: "encryptionkey@0"
I20260504 14:08:17.737028   340 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:17.739521   355 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:17.740617   340 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:17.740774   340 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "4bdc1722999746fbbfa6b0f7027cd719"
format_stamp: "Formatted at 2026-05-04 14:08:17 on dist-test-slave-2x32"
server_key: "521d2b7027ac89685ca99bb44aa83bca"
server_key_iv: "70a98ced8d3448aed0badf93a35a1f51"
server_key_version: "encryptionkey@0"
I20260504 14:08:17.740895   340 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:17.757026   340 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:17.765229   340 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:17.765457   340 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:17.774514   340 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:46055
I20260504 14:08:17.774515   407 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:46055 every 8 connection(s)
I20260504 14:08:17.775615   340 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:17.778519   408 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:17.782135 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 340
I20260504 14:08:17.782341 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:17.782620 26619 external_mini_cluster.cc:1468] Setting key 7837015a0d86a3427683b19e608211e0
I20260504 14:08:17.784612   408 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: Bootstrap starting.
I20260504 14:08:17.786860   408 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:17.787595   408 log.cc:826] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:17.789565   408 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: No bootstrap required, opened a new log
May 04 14:08:17 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903697, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:17.792557   408 raft_consensus.cc:359] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } }
I20260504 14:08:17.792779   408 raft_consensus.cc:385] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:17.792866   408 raft_consensus.cc:740] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4bdc1722999746fbbfa6b0f7027cd719, State: Initialized, Role: FOLLOWER
I20260504 14:08:17.793358   408 consensus_queue.cc:260] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } }
I20260504 14:08:17.793502   408 raft_consensus.cc:399] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:17.793582   408 raft_consensus.cc:493] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:17.793708   408 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:17.794651   408 raft_consensus.cc:515] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } }
I20260504 14:08:17.795039   408 leader_election.cc:304] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4bdc1722999746fbbfa6b0f7027cd719; no voters: 
I20260504 14:08:17.795399   408 leader_election.cc:290] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:17.795845   413 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:17.796046   411 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:17.784016 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:47228 (local address 127.25.254.254:46055)
0504 14:08:17.784478 (+   462us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:17.784488 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:17.784522 (+    34us) server_negotiation.cc:408] Connection header received
0504 14:08:17.785183 (+   661us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:17.785200 (+    17us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:17.785494 (+   294us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:17.785836 (+   342us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:17.786810 (+   974us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:17.787599 (+   789us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:17.788295 (+   696us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:17.788604 (+   309us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:17.790980 (+  2376us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:17.791004 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:17.791016 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:17.791047 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:17.793093 (+  2046us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:17.793550 (+   457us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:17.793556 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:17.793562 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:17.793684 (+   122us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:17.793954 (+   270us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:17.793958 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:17.793960 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:17.794371 (+   411us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:17.794567 (+   196us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:17.794902 (+   335us) server_negotiation.cc:300] Negotiation successful
0504 14:08:17.795172 (+   270us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":322,"thread_start_us":147,"threads_started":1}
I20260504 14:08:17.796053   413 raft_consensus.cc:697] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [term 1 LEADER]: Becoming Leader. State: Replica: 4bdc1722999746fbbfa6b0f7027cd719, State: Running, Role: LEADER
I20260504 14:08:17.796386   413 consensus_queue.cc:237] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } }
I20260504 14:08:17.796705   408 sys_catalog.cc:565] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:17.797942   414 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "4bdc1722999746fbbfa6b0f7027cd719" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } } }
I20260504 14:08:17.798267   414 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:17.798135   415 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 4bdc1722999746fbbfa6b0f7027cd719. Latest consensus state: current_term: 1 leader_uuid: "4bdc1722999746fbbfa6b0f7027cd719" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4bdc1722999746fbbfa6b0f7027cd719" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 46055 } } }
I20260504 14:08:17.798609   415 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719 [sys.catalog]: This master's current role is: LEADER
W20260504 14:08:17.801736   428 catalog_manager.cc:1568] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:08:17.801808   428 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:08:17.801898   423 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:17.802726   423 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:17.808574   423 catalog_manager.cc:1357] Generated new cluster ID: f1348a58687f43b7b371507401da52a8
I20260504 14:08:17.808686   423 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:17.822126   423 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:17.823143   423 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:17.832211   423 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 4bdc1722999746fbbfa6b0f7027cd719: Generated new TSK 0
I20260504 14:08:17.832945   423 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:17.899269 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46055
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:45163
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:18.010255   436 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:18.010485   436 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:18.010545   436 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:18.014039   436 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:18.014120   436 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:18.014278   436 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:18.018889   436 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:45163
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46055
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.436
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:18.019980   436 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:18.020783   436 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:18.027482   444 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.027458   441 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.027464   442 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:18.027534   436 server_base.cc:1061] running on GCE node
I20260504 14:08:18.028064   436 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:18.028645   436 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:18.029834   436 hybrid_clock.cc:648] HybridClock initialized: now 1777903698029825 us; error 30 us; skew 500 ppm
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.032583   436 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:08:18.033665   436 webserver.cc:492] Webserver started at http://127.25.254.193:34011/ using document root <none> and password file <none>
I20260504 14:08:18.034267   436 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:18.034343   436 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:18.034538   436 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:18.036300   436 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "f50908480a4e42899dce2a0cf47bfedf"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "7daaecbbcf37e15441717b3856291696"
server_key_iv: "01fd25b794bbc7a00a8bbf7f789f65f3"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.036813   436 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "f50908480a4e42899dce2a0cf47bfedf"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "7daaecbbcf37e15441717b3856291696"
server_key_iv: "01fd25b794bbc7a00a8bbf7f789f65f3"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.040295   436 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:18.042619   451 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.043643   436 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:18.043781   436 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "f50908480a4e42899dce2a0cf47bfedf"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "7daaecbbcf37e15441717b3856291696"
server_key_iv: "01fd25b794bbc7a00a8bbf7f789f65f3"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.043893   436 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:18.059489   436 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:18.062747   436 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:18.062985   436 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:18.063592   436 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:18.064579   436 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:18.064657   436 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.064735   436 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:18.064803   436 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.075681   436 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:41167
I20260504 14:08:18.075695   564 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:41167 every 8 connection(s)
I20260504 14:08:18.076665   436 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:18.086006 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 436
I20260504 14:08:18.086144 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:18.086498 26619 external_mini_cluster.cc:1468] Setting key 5780c691e51dcb7e6b5b51127c033cbc
I20260504 14:08:18.089581   411 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.078523 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:52057 (local address 127.25.254.254:46055)
0504 14:08:18.078741 (+   218us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.078747 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.079392 (+   645us) server_negotiation.cc:408] Connection header received
0504 14:08:18.080216 (+   824us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.080219 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.080270 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.080346 (+    76us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.081981 (+  1635us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.082598 (+   617us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.083296 (+   698us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.083431 (+   135us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.085836 (+  2405us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.085863 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.085869 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.085907 (+    38us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.087632 (+  1725us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.088155 (+   523us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.088158 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.088160 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.088211 (+    51us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.088530 (+   319us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.088533 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.088535 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.088688 (+   153us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.088795 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.089283 (+   488us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.089411 (+   128us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":66}
I20260504 14:08:18.090323   567 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.078800 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46055 (local address 127.25.254.193:52057)
0504 14:08:18.079243 (+   443us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:18.079285 (+    42us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:18.080039 (+   754us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:18.080495 (+   456us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:18.080502 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:18.080912 (+   410us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:18.081832 (+   920us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.081846 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.082769 (+   923us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.082772 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:18.083184 (+   412us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.083191 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.083365 (+   174us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.083897 (+   532us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:18.083915 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:18.085646 (+  1731us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:18.087789 (+  2143us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.087795 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.087805 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.088046 (+   241us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.088317 (+   271us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.088320 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.088321 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.088426 (+   105us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.088796 (+   370us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:18.088805 (+     9us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:18.089040 (+   235us) client_negotiation.cc:770] Sending connection context
0504 14:08:18.089247 (+   207us) client_negotiation.cc:241] Negotiation successful
0504 14:08:18.089485 (+   238us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":267,"thread_start_us":111,"threads_started":1}
I20260504 14:08:18.091614   565 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46055
I20260504 14:08:18.091889   565 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:18.092399   565 heartbeater.cc:507] Master 127.25.254.254:46055 requested a full tablet report, sending...
I20260504 14:08:18.094115   372 ts_manager.cc:194] Registered new tserver with Master: f50908480a4e42899dce2a0cf47bfedf (127.25.254.193:41167)
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:08:18.095907   372 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:52057
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:18.142009 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46055
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:45163
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:18.248564   572 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:18.248883   572 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:18.248972   572 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:18.252524   572 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:18.252605   572 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:18.252730   572 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:18.258226   572 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:45163
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46055
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.572
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:18.259649   572 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:18.260591   572 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:18.267395   577 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.267419   580 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.267407   578 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:18.268519   572 server_base.cc:1061] running on GCE node
I20260504 14:08:18.269043   572 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:18.269677   572 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:18.270877   572 hybrid_clock.cc:648] HybridClock initialized: now 1777903698270848 us; error 40 us; skew 500 ppm
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.273895   572 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:08:18.275045   572 webserver.cc:492] Webserver started at http://127.25.254.194:37851/ using document root <none> and password file <none>
I20260504 14:08:18.275615   572 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:18.275691   572 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:18.275899   572 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:18.277779   572 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "35de202e142142948120dcaec1034779"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "b8b07b91f632dfd164975a4d1c8ade75"
server_key_iv: "61e41fc3c64b05cadbeffdbd6e4b22d7"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.278370   572 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "35de202e142142948120dcaec1034779"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "b8b07b91f632dfd164975a4d1c8ade75"
server_key_iv: "61e41fc3c64b05cadbeffdbd6e4b22d7"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.281898   572 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.001s
I20260504 14:08:18.284545   587 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.285914   572 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:18.286075   572 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "35de202e142142948120dcaec1034779"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "b8b07b91f632dfd164975a4d1c8ade75"
server_key_iv: "61e41fc3c64b05cadbeffdbd6e4b22d7"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.286229   572 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:18.296798   572 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:18.300071   572 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:18.300284   572 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:18.300880   572 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:18.301759   572 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:18.301836   572 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.301903   572 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:18.301949   572 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.311750   572 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:34669
I20260504 14:08:18.311774   700 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:34669 every 8 connection(s)
I20260504 14:08:18.312714   572 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:18.318186 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 572
I20260504 14:08:18.318332 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:18.318571 26619 external_mini_cluster.cc:1468] Setting key 929a51bbdc18f5fb4ebd706736a0f45f
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:18.326931   411 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.314587 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:47317 (local address 127.25.254.254:46055)
0504 14:08:18.314742 (+   155us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.314746 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.315414 (+   668us) server_negotiation.cc:408] Connection header received
0504 14:08:18.316346 (+   932us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.316350 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.316412 (+    62us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.316513 (+   101us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.318248 (+  1735us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.318973 (+   725us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.319926 (+   953us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.320139 (+   213us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.322831 (+  2692us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.322856 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.322862 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.322898 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.324503 (+  1605us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.325207 (+   704us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.325210 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.325211 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.325257 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.325695 (+   438us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.325699 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.325700 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.325851 (+   151us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.325948 (+    97us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.326645 (+   697us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.326763 (+   118us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:08:18.327853   703 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.314865 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46055 (local address 127.25.254.194:47317)
0504 14:08:18.315274 (+   409us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:18.315311 (+    37us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:18.316141 (+   830us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:18.316678 (+   537us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:18.316686 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:18.317063 (+   377us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:18.317992 (+   929us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.318012 (+    20us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.319149 (+  1137us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.319153 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:18.319684 (+   531us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.319693 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.319937 (+   244us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.320556 (+   619us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:18.320596 (+    40us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:18.322628 (+  2032us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:18.324647 (+  2019us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.324657 (+    10us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.324672 (+    15us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.325089 (+   417us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.325369 (+   280us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.325375 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.325403 (+    28us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.325577 (+   174us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.325959 (+   382us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:18.325967 (+     8us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:18.326349 (+   382us) client_negotiation.cc:770] Sending connection context
0504 14:08:18.326608 (+   259us) client_negotiation.cc:241] Negotiation successful
0504 14:08:18.326918 (+   310us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":240,"thread_start_us":107,"threads_started":1}
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:08:18.329078   701 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46055
I20260504 14:08:18.329350   701 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:18.329906   701 heartbeater.cc:507] Master 127.25.254.254:46055 requested a full tablet report, sending...
I20260504 14:08:18.331193   372 ts_manager.cc:194] Registered new tserver with Master: 35de202e142142948120dcaec1034779 (127.25.254.194:34669)
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
I20260504 14:08:18.332002   372 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:47317
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:18.377467 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:46055
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:45163
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:18.489671   708 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:18.489897   708 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:18.489962   708 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:18.493450   708 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:18.493521   708 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:18.493610   708 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:18.498133   708 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:45163
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:46055
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.708
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:18.499346   708 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:18.500183   708 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:18.506815   716 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.506915   714 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:18.506958   713 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:18.507351   708 server_base.cc:1061] running on GCE node
I20260504 14:08:18.507704   708 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:18.508252   708 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:18.509476   708 hybrid_clock.cc:648] HybridClock initialized: now 1777903698509457 us; error 31 us; skew 500 ppm
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.512652   708 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:08:18.513916   708 webserver.cc:492] Webserver started at http://127.25.254.195:36073/ using document root <none> and password file <none>
I20260504 14:08:18.514577   708 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:18.514662   708 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:18.514894   708 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:18.516645   708 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "a01abad369d4451b8e403acd87eb359a"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "d63f00c873bf2579a4649c1c4e9962ab"
server_key_iv: "9c88c8ce6280dd24eb7919a07bc0a981"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.517148   708 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "a01abad369d4451b8e403acd87eb359a"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "d63f00c873bf2579a4649c1c4e9962ab"
server_key_iv: "9c88c8ce6280dd24eb7919a07bc0a981"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.520725   708 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:08:18.523200   723 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.524256   708 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:18.524398   708 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "a01abad369d4451b8e403acd87eb359a"
format_stamp: "Formatted at 2026-05-04 14:08:18 on dist-test-slave-2x32"
server_key: "d63f00c873bf2579a4649c1c4e9962ab"
server_key_iv: "9c88c8ce6280dd24eb7919a07bc0a981"
server_key_version: "encryptionkey@0"
I20260504 14:08:18.524510   708 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:18.561856   708 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:18.565395   708 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:18.565631   708 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:18.566280   708 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:18.567207   708 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:18.567279   708 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.567349   708 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:18.567391   708 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:18.577111   708 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:39101
I20260504 14:08:18.577135   836 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:39101 every 8 connection(s)
I20260504 14:08:18.578080   708 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:18.584584 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 708
I20260504 14:08:18.584712 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:18.585012 26619 external_mini_cluster.cc:1468] Setting key fc152ae259950f538e4eb63664b34881
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:18.591646   411 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.580036 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:60829 (local address 127.25.254.254:46055)
0504 14:08:18.580190 (+   154us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.580194 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.580895 (+   701us) server_negotiation.cc:408] Connection header received
0504 14:08:18.581802 (+   907us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.581807 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.581867 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.582003 (+   136us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.583587 (+  1584us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.584100 (+   513us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.584986 (+   886us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.585219 (+   233us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.587951 (+  2732us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.587965 (+    14us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.587967 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.587994 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.589473 (+  1479us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.589988 (+   515us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.589991 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.589993 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.590039 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.590446 (+   407us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.590449 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.590450 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.590722 (+   272us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.590839 (+   117us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.591396 (+   557us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.591506 (+   110us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:08:18.592317   839 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.580314 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:46055 (local address 127.25.254.195:60829)
0504 14:08:18.580753 (+   439us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:18.580786 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:18.581557 (+   771us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:18.582190 (+   633us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:18.582200 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:18.582574 (+   374us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:18.583427 (+   853us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.583440 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.584245 (+   805us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.584249 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:18.584828 (+   579us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:18.584840 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.585061 (+   221us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.585762 (+   701us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:18.585787 (+    25us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:18.587795 (+  2008us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:18.589599 (+  1804us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.589604 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.589615 (+    11us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.589864 (+   249us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.590137 (+   273us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:18.590139 (+     2us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:18.590141 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:18.590312 (+   171us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:18.590848 (+   536us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:18.590853 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:18.591142 (+   289us) client_negotiation.cc:770] Sending connection context
0504 14:08:18.591314 (+   172us) client_negotiation.cc:241] Negotiation successful
0504 14:08:18.591530 (+   216us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":258,"thread_start_us":112,"threads_started":1}
I20260504 14:08:18.593670   837 heartbeater.cc:344] Connected to a master server at 127.25.254.254:46055
I20260504 14:08:18.593914   837 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:18.594480   837 heartbeater.cc:507] Master 127.25.254.254:46055 requested a full tablet report, sending...
I20260504 14:08:18.595597   372 ts_manager.cc:194] Registered new tserver with Master: a01abad369d4451b8e403acd87eb359a (127.25.254.195:39101)
I20260504 14:08:18.596231   372 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:60829
I20260504 14:08:18.600065 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal test-admin with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/test-admin.keytab.
Entry for principal test-admin with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/test-admin.keytab.
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.620170 26619 init.cc:377] Logged in from keytab as test-admin@KRBTEST.COM (short username test-admin)
W20260504 14:08:18.623958   844 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:18.622109 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48820 (local address 127.25.254.193:41167)
0504 14:08:18.622543 (+   434us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.622553 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.622582 (+    29us) server_negotiation.cc:408] Connection header received
0504 14:08:18.622658 (+    76us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.622664 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.622918 (+   254us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.623080 (+   162us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.623766 (+   686us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:48820: BlockingRecv error: recv got EOF from 127.0.0.1:48820 (error 108)
Metrics: {"server-negotiator.queue_time_us":274,"thread_start_us":151,"threads_started":1}
W20260504 14:08:18.624128 26619 init.cc:300] Could not find kerberos principal in credential cache '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc' of type FILE
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.625303 26619 init.cc:377] Logged in from keytab as test-admin@KRBTEST.COM (short username test-admin)
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.626350 26619 init.cc:295] Successfully reacquired a new kerberos TGT
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:08:18.637493   844 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.627446 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48836 (local address 127.25.254.193:41167)
0504 14:08:18.627618 (+   172us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.627621 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.627756 (+   135us) server_negotiation.cc:408] Connection header received
0504 14:08:18.627871 (+   115us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.627873 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.627926 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.627998 (+    72us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.628748 (+   750us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.629596 (+   848us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.630513 (+   917us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.630811 (+   298us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.633007 (+  2196us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.633035 (+    28us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.633046 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.633090 (+    44us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.635503 (+  2413us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.636032 (+   529us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.636038 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.636039 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.636100 (+    61us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.636449 (+   349us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.636452 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.636453 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.636677 (+   224us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.636856 (+   179us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.637106 (+   250us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.637304 (+   198us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":77}
W20260504 14:08:18.641649   844 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:18.640439 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48846 (local address 127.25.254.193:41167)
0504 14:08:18.640619 (+   180us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.640622 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.640635 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:18.640864 (+   229us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.640867 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.640913 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.640988 (+    75us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.641511 (+   523us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:48846: BlockingRecv error: recv got EOF from 127.0.0.1:48846 (error 108)
Metrics: {"server-negotiator.queue_time_us":71}
W20260504 14:08:18.641737 26619 init.cc:300] Could not find kerberos principal in credential cache '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc' of type FILE
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.642930 26619 init.cc:377] Logged in from keytab as test-admin@KRBTEST.COM (short username test-admin)
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.643921 26619 init.cc:295] Successfully reacquired a new kerberos TGT
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:08:18.653179   844 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.644802 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48852 (local address 127.25.254.193:41167)
0504 14:08:18.644934 (+   132us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.644937 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.645078 (+   141us) server_negotiation.cc:408] Connection header received
0504 14:08:18.645137 (+    59us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.645142 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.645200 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.645273 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.646022 (+   749us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.646629 (+   607us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.647308 (+   679us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.647459 (+   151us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.649759 (+  2300us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.649779 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.649781 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.649807 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.651339 (+  1532us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.651890 (+   551us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.651893 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.651895 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.651964 (+    69us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.652234 (+   270us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.652237 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.652238 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.652370 (+   132us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.652474 (+   104us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.652836 (+   362us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.653011 (+   175us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
W20260504 14:08:18.656178   844 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:18.655156 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48866 (local address 127.25.254.193:41167)
0504 14:08:18.655297 (+   141us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.655300 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.655313 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:18.655459 (+   146us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.655462 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.655505 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.655574 (+    69us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.656015 (+   441us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:48866: BlockingRecv error: recv got EOF from 127.0.0.1:48866 (error 108)
Metrics: {"server-negotiator.queue_time_us":53}
W20260504 14:08:18.656275 26619 init.cc:300] Could not find kerberos principal in credential cache '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestCorruptKerberosCC.1777903638260922-26619-0/krb5kdc/krb5cc' of type FILE
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.657423 26619 init.cc:377] Logged in from keytab as test-admin@KRBTEST.COM (short username test-admin)
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:18.658533 26619 init.cc:295] Successfully reacquired a new kerberos TGT
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[324](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903698, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.193@KRBTEST.COM
I20260504 14:08:18.667611   844 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:18.659538 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48880 (local address 127.25.254.193:41167)
0504 14:08:18.659694 (+   156us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:18.659698 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:18.659760 (+    62us) server_negotiation.cc:408] Connection header received
0504 14:08:18.659863 (+   103us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:18.659865 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:18.659906 (+    41us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:18.659975 (+    69us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:18.660601 (+   626us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.661110 (+   509us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:18.661847 (+   737us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:18.662051 (+   204us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:18.664306 (+  2255us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:18.664325 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:18.664327 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:18.664356 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:18.665898 (+  1542us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.666364 (+   466us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.666367 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.666369 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.666415 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:18.666701 (+   286us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:18.666704 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:18.666706 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:18.666892 (+   186us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:18.666997 (+   105us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:18.667277 (+   280us) server_negotiation.cc:300] Negotiation successful
0504 14:08:18.667415 (+   138us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":37}
I20260504 14:08:18.668901 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 436
I20260504 14:08:18.674847 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 572
I20260504 14:08:18.680645 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 708
I20260504 14:08:18.686760 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 340
2026-05-04T14:08:18Z chronyd exiting
[       OK ] SecurityITest.TestCorruptKerberosCC (3325 ms)
[ RUN      ] SecurityITest.TestNonDefaultPrincipalMultipleMaster
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[857](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[857](info): set up 2 sockets
May 04 14:08:18 dist-test-slave-2x32 krb5kdc[857](info): commencing operation
krb5kdc: starting...
W20260504 14:08:20.734983 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.025s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:20 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903700, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:20Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:20Z Disabled control of system clock
WARNING: no policy specified for oryx/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:20.895327 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:41049
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:41049
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:21.003500   873 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:21.003790   873 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:21.003861   873 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:21.008107   873 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:21.008196   873 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:21.008222   873 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:21.008242   873 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:21.008263   873 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:21.013387   873 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:41049
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:41049
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.873
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:21.014612   873 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:21.015568   873 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:21.021716   879 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.021698   878 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.021816   881 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:21.022277   873 server_base.cc:1061] running on GCE node
I20260504 14:08:21.022939   873 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:21.023999   873 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:21.025195   873 hybrid_clock.cc:648] HybridClock initialized: now 1777903701025152 us; error 59 us; skew 500 ppm
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:21.028359   873 init.cc:377] Logged in from keytab as oryx/127.25.254.254@KRBTEST.COM (short username oryx)
I20260504 14:08:21.029475   873 webserver.cc:492] Webserver started at http://127.25.254.254:34851/ using document root <none> and password file <none>
I20260504 14:08:21.030021   873 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:21.030067   873 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:21.030303   873 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:21.032078   873 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "8fc8681ad972459a8b49398127c6b42c"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "50ce71fd94ada02850b6f7e5bb5764f9"
server_key_iv: "920935b62db765b204c95cf868a51419"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.032534   873 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "8fc8681ad972459a8b49398127c6b42c"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "50ce71fd94ada02850b6f7e5bb5764f9"
server_key_iv: "920935b62db765b204c95cf868a51419"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.036085   873 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.003s
I20260504 14:08:21.038435   888 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.039685   873 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:21.039798   873 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "8fc8681ad972459a8b49398127c6b42c"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "50ce71fd94ada02850b6f7e5bb5764f9"
server_key_iv: "920935b62db765b204c95cf868a51419"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.039903   873 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:21.065270   873 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:21.068408   873 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:21.068585   873 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:21.076495   873 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:41049
I20260504 14:08:21.076516   940 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:41049 every 8 connection(s)
I20260504 14:08:21.077612   873 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:21.080700   941 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:21.082213 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 873
I20260504 14:08:21.082357 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:21.082633 26619 external_mini_cluster.cc:1468] Setting key 7ae45bd7be878a027a9cddcf917d4ed3
I20260504 14:08:21.087401   941 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Bootstrap starting.
I20260504 14:08:21.089844   941 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Neither blocks nor log segments found. Creating new log.
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903700, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:21.090626   941 log.cc:826] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:21.092995   941 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: No bootstrap required, opened a new log
I20260504 14:08:21.095835   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.084021 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51698 (local address 127.25.254.254:41049)
0504 14:08:21.084445 (+   424us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:21.084457 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:21.084497 (+    40us) server_negotiation.cc:408] Connection header received
0504 14:08:21.085186 (+   689us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:21.085214 (+    28us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:21.085592 (+   378us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:21.085966 (+   374us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:21.086917 (+   951us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.088054 (+  1137us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.088718 (+   664us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.088990 (+   272us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.091230 (+  2240us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:21.091255 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:21.091274 (+    19us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:21.091310 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:21.093262 (+  1952us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.093782 (+   520us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.093787 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.093792 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.093861 (+    69us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.094129 (+   268us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.094132 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.094133 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.094493 (+   360us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:21.094632 (+   139us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:21.094950 (+   318us) server_negotiation.cc:300] Negotiation successful
0504 14:08:21.095194 (+   244us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":277,"thread_start_us":116,"threads_started":1}
I20260504 14:08:21.096179   941 raft_consensus.cc:359] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } }
I20260504 14:08:21.096361   941 raft_consensus.cc:385] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:21.096426   941 raft_consensus.cc:740] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8fc8681ad972459a8b49398127c6b42c, State: Initialized, Role: FOLLOWER
I20260504 14:08:21.096907   941 consensus_queue.cc:260] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } }
I20260504 14:08:21.097057   941 raft_consensus.cc:399] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:21.097116   941 raft_consensus.cc:493] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:21.097204   941 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:21.098570   941 raft_consensus.cc:515] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } }
I20260504 14:08:21.098891   941 leader_election.cc:304] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 8fc8681ad972459a8b49398127c6b42c; no voters: 
I20260504 14:08:21.099153   941 leader_election.cc:290] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:21.099350   946 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:21.099676   946 raft_consensus.cc:697] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 LEADER]: Becoming Leader. State: Replica: 8fc8681ad972459a8b49398127c6b42c, State: Running, Role: LEADER
I20260504 14:08:21.099972   946 consensus_queue.cc:237] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } }
I20260504 14:08:21.100319   941 sys_catalog.cc:565] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:21.101495   948 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: New leader 8fc8681ad972459a8b49398127c6b42c. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } }
I20260504 14:08:21.101651   948 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:21.101943   947 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } }
I20260504 14:08:21.102026   947 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:21.102011   955 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:21.105186   955 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:21.110947   955 catalog_manager.cc:1357] Generated new cluster ID: c0935c8e513349598af8d1e724bba87c
I20260504 14:08:21.111055   955 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:21.118122   955 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:21.119010   955 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:21.133090   955 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Generated new TSK 0
I20260504 14:08:21.133774   955 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for oryx/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:21.198879 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:41049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:21.311704   969 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:21.311946   969 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:21.312053   969 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:21.315784   969 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:21.315860   969 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:21.315979   969 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:21.320422   969 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.969
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:21.321533   969 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:21.322468   969 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:21.329047   977 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.329051   975 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.329051   974 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:21.329424   969 server_base.cc:1061] running on GCE node
I20260504 14:08:21.329895   969 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:21.330536   969 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:21.331713   969 hybrid_clock.cc:648] HybridClock initialized: now 1777903701331684 us; error 40 us; skew 500 ppm
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:21.334656   969 init.cc:377] Logged in from keytab as oryx/127.25.254.193@KRBTEST.COM (short username oryx)
I20260504 14:08:21.335803   969 webserver.cc:492] Webserver started at http://127.25.254.193:42399/ using document root <none> and password file <none>
I20260504 14:08:21.336391   969 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:21.336443   969 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:21.336670   969 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:21.338495   969 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "3162d5c41c52485598abe7a0a0eec507"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "b30318173ec89571540a97b067bf3d12"
server_key_iv: "6108ebe4d0b67d73b48667b516f82339"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.339035   969 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "3162d5c41c52485598abe7a0a0eec507"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "b30318173ec89571540a97b067bf3d12"
server_key_iv: "6108ebe4d0b67d73b48667b516f82339"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.342677   969 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:21.345147   984 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.346395   969 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:21.346529   969 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "3162d5c41c52485598abe7a0a0eec507"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "b30318173ec89571540a97b067bf3d12"
server_key_iv: "6108ebe4d0b67d73b48667b516f82339"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.346698   969 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:21.360349   969 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:21.363806   969 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:21.364033   969 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:21.364666   969 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:21.365679   969 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:21.365731   969 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.365769   969 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:21.365829   969 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.376555   969 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:46541
I20260504 14:08:21.376573  1097 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:46541 every 8 connection(s)
I20260504 14:08:21.377671   969 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:21.385418 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 969
I20260504 14:08:21.385582 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:21.385856 26619 external_mini_cluster.cc:1468] Setting key 9929323d14e2bf5b7e20bd9a4d951738
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:21.392339   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.379913 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60517 (local address 127.25.254.254:41049)
0504 14:08:21.380084 (+   171us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:21.380088 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:21.380996 (+   908us) server_negotiation.cc:408] Connection header received
0504 14:08:21.381927 (+   931us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:21.381932 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:21.382000 (+    68us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:21.382102 (+   102us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:21.383365 (+  1263us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.383892 (+   527us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.384559 (+   667us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.384750 (+   191us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.388076 (+  3326us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:21.388101 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:21.388108 (+     7us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:21.388146 (+    38us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:21.389883 (+  1737us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.390540 (+   657us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.390547 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.390552 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.390613 (+    61us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.390976 (+   363us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.390982 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.390986 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.391193 (+   207us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:21.391301 (+   108us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:21.391955 (+   654us) server_negotiation.cc:300] Negotiation successful
0504 14:08:21.392103 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":71}
I20260504 14:08:21.392830  1100 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.380277 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:60517)
0504 14:08:21.380816 (+   539us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:21.380853 (+    37us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:21.381682 (+   829us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:21.382305 (+   623us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:21.382314 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:21.382746 (+   432us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:21.383209 (+   463us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.383220 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.384023 (+   803us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.384027 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:21.384427 (+   400us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.384433 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.384678 (+   245us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.385937 (+  1259us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:21.385962 (+    25us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:21.387898 (+  1936us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:21.390023 (+  2125us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.390032 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.390047 (+    15us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.390425 (+   378us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.390746 (+   321us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.390749 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.390751 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.390850 (+    99us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.391309 (+   459us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:21.391314 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:21.391583 (+   269us) client_negotiation.cc:770] Sending connection context
0504 14:08:21.391830 (+   247us) client_negotiation.cc:241] Negotiation successful
0504 14:08:21.392047 (+   217us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":301,"thread_start_us":123,"threads_started":1}
I20260504 14:08:21.394101  1098 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:21.394408  1098 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:21.395073  1098 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
WARNING: no policy specified for oryx/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:08:21.396725   905 ts_manager.cc:194] Registered new tserver with Master: 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
I20260504 14:08:21.398103   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:60517
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:21.443238 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:41049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:21.550880  1105 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:21.551190  1105 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:21.551307  1105 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:21.555105  1105 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:21.555245  1105 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:21.555367  1105 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:21.560286  1105 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1105
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:21.561530  1105 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:21.562455  1105 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:21.569283  1113 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.569304  1111 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.569334  1110 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:21.569943  1105 server_base.cc:1061] running on GCE node
I20260504 14:08:21.570408  1105 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:21.571027  1105 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:21.572227  1105 hybrid_clock.cc:648] HybridClock initialized: now 1777903701572199 us; error 40 us; skew 500 ppm
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:21.575531  1105 init.cc:377] Logged in from keytab as oryx/127.25.254.194@KRBTEST.COM (short username oryx)
I20260504 14:08:21.576673  1105 webserver.cc:492] Webserver started at http://127.25.254.194:45723/ using document root <none> and password file <none>
I20260504 14:08:21.577260  1105 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:21.577338  1105 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:21.577548  1105 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:21.579286  1105 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "132a133de3102d7ab024268479044fb8"
server_key_iv: "da992cdf2236ad34ac26c572d1f7c401"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.579857  1105 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "132a133de3102d7ab024268479044fb8"
server_key_iv: "da992cdf2236ad34ac26c572d1f7c401"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.583236  1105 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.001s	sys 0.003s
I20260504 14:08:21.585669  1120 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.586969  1105 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:21.587132  1105 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "132a133de3102d7ab024268479044fb8"
server_key_iv: "da992cdf2236ad34ac26c572d1f7c401"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.587255  1105 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:21.608588  1105 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:21.611922  1105 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:21.612279  1105 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:21.613005  1105 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:21.614224  1105 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:21.614306  1105 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.614385  1105 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:21.614420  1105 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.625926  1105 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:44781
I20260504 14:08:21.625952  1233 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:44781 every 8 connection(s)
I20260504 14:08:21.627046  1105 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:21.630556 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1105
I20260504 14:08:21.630648 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:21.631045 26619 external_mini_cluster.cc:1468] Setting key 39003917c93a07509a0e0cae532e6592
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for oryx/127.25.254.195@KRBTEST.COM; defaulting to no policy
I20260504 14:08:21.642237   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.629184 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:39947 (local address 127.25.254.254:41049)
0504 14:08:21.629334 (+   150us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:21.629338 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:21.630079 (+   741us) server_negotiation.cc:408] Connection header received
0504 14:08:21.631147 (+  1068us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:21.631152 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:21.631217 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:21.631327 (+   110us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:21.632915 (+  1588us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.633497 (+   582us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.634405 (+   908us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.634645 (+   240us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.637451 (+  2806us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:21.637480 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:21.637486 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:21.637525 (+    39us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:21.639400 (+  1875us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.640172 (+   772us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.640177 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.640179 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.640240 (+    61us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.640773 (+   533us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.640776 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.640777 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.640974 (+   197us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:21.641064 (+    90us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:21.641627 (+   563us) server_negotiation.cc:300] Negotiation successful
0504 14:08:21.642029 (+   402us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":56}
I20260504 14:08:21.642889  1236 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.629407 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:39947)
0504 14:08:21.629905 (+   498us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:21.629941 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:21.630929 (+   988us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:21.631494 (+   565us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:21.631502 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:21.632055 (+   553us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:21.632750 (+   695us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.632768 (+    18us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.633650 (+   882us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.633657 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:21.634289 (+   632us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.634297 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.634465 (+   168us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.635668 (+  1203us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:21.635688 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:21.637258 (+  1570us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:21.639751 (+  2493us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.639757 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.639769 (+    12us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.640046 (+   277us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.640386 (+   340us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.640390 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.640392 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.640633 (+   241us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.641118 (+   485us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:21.641123 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:21.641371 (+   248us) client_negotiation.cc:770] Sending connection context
0504 14:08:21.641582 (+   211us) client_negotiation.cc:241] Negotiation successful
0504 14:08:21.641849 (+   267us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":311,"thread_start_us":100,"threads_started":1}
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.195@KRBTEST.COM" created.
I20260504 14:08:21.644352  1234 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:21.644721  1234 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:21.645372  1234 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
I20260504 14:08:21.646876   905 ts_manager.cc:194] Registered new tserver with Master: 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:21.647555   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:39947
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:21.694747 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:41049
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:21.808040  1241 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:21.808372  1241 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:21.808467  1241 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:21.812177  1241 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:21.812290  1241 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:21.812408  1241 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:21.817206  1241 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1241
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:21.818467  1241 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:21.819339  1241 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:21.825893  1247 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.825883  1249 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:21.825893  1246 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:21.826311  1241 server_base.cc:1061] running on GCE node
I20260504 14:08:21.826738  1241 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:21.827327  1241 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:21.828725  1241 hybrid_clock.cc:648] HybridClock initialized: now 1777903701828714 us; error 67 us; skew 500 ppm
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:21.831961  1241 init.cc:377] Logged in from keytab as oryx/127.25.254.195@KRBTEST.COM (short username oryx)
I20260504 14:08:21.833112  1241 webserver.cc:492] Webserver started at http://127.25.254.195:36455/ using document root <none> and password file <none>
I20260504 14:08:21.833735  1241 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:21.833815  1241 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:21.834034  1241 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:21.835881  1241 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "e7715d2595ab45af9b8b972e34e95e10"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "d42f70ebf68a15e9a40c1dc4a50a1c61"
server_key_iv: "ab7bcfb17ab0e99ea446a35810d43aff"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.836400  1241 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "e7715d2595ab45af9b8b972e34e95e10"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "d42f70ebf68a15e9a40c1dc4a50a1c61"
server_key_iv: "ab7bcfb17ab0e99ea446a35810d43aff"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.840086  1241 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:21.842479  1256 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.843652  1241 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:21.843816  1241 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "e7715d2595ab45af9b8b972e34e95e10"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "d42f70ebf68a15e9a40c1dc4a50a1c61"
server_key_iv: "ab7bcfb17ab0e99ea446a35810d43aff"
server_key_version: "encryptionkey@0"
I20260504 14:08:21.843940  1241 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:21.860201  1241 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:21.863317  1241 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:21.863541  1241 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:21.864181  1241 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:21.865115  1241 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:21.865188  1241 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.865262  1241 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:21.865304  1241 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:21.875141  1241 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37043
I20260504 14:08:21.875177  1369 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37043 every 8 connection(s)
I20260504 14:08:21.876106  1241 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:21.881568 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1241
I20260504 14:08:21.881670 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:21.881968 26619 external_mini_cluster.cc:1468] Setting key fe055ac1dca03fc38e2637ee8f20364b
May 04 14:08:21 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:21.889209   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.877963 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:53509 (local address 127.25.254.254:41049)
0504 14:08:21.878109 (+   146us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:21.878113 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:21.878930 (+   817us) server_negotiation.cc:408] Connection header received
0504 14:08:21.879806 (+   876us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:21.879810 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:21.879874 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:21.879986 (+   112us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:21.881164 (+  1178us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.881745 (+   581us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.882517 (+   772us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.882842 (+   325us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.885701 (+  2859us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:21.885726 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:21.885728 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:21.885757 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:21.887245 (+  1488us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.887792 (+   547us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.887796 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.887797 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.887847 (+    50us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:21.888145 (+   298us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:21.888149 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:21.888151 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:21.888312 (+   161us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:21.888422 (+   110us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:21.888911 (+   489us) server_negotiation.cc:300] Negotiation successful
0504 14:08:21.889040 (+   129us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":53}
I20260504 14:08:21.889837  1372 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:21.878312 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:53509)
0504 14:08:21.878742 (+   430us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:21.878775 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:21.879596 (+   821us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:21.880149 (+   553us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:21.880157 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:21.880541 (+   384us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:21.881004 (+   463us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.881014 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.881887 (+   873us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:21.881894 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:21.882383 (+   489us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:21.882391 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:21.882609 (+   218us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:21.883918 (+  1309us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:21.883937 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:21.885537 (+  1600us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:21.887385 (+  1848us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.887394 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.887410 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.887661 (+   251us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.887952 (+   291us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:21.887956 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:21.887957 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:21.888049 (+    92us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:21.888422 (+   373us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:21.888429 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:21.888665 (+   236us) client_negotiation.cc:770] Sending connection context
0504 14:08:21.888885 (+   220us) client_negotiation.cc:241] Negotiation successful
0504 14:08:21.889121 (+   236us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":255,"thread_start_us":90,"threads_started":1}
I20260504 14:08:21.891047  1370 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:21.891287  1370 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:21.891846  1370 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
I20260504 14:08:21.892949   905 ts_manager.cc:194] Registered new tserver with Master: e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195:37043)
I20260504 14:08:21.893529   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:53509
I20260504 14:08:21.896361 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
WARNING: no policy specified for oryx/127.25.254.253@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.253@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.253 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.253 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.253@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.253@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.253 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.253 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:21.955315 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.253:42377
--webserver_interface=127.25.254.253
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.253
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:41049,127.25.254.253:42377
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:22.062726  1377 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:22.062981  1377 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:22.063035  1377 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:22.066514  1377 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:22.066596  1377 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:22.066627  1377 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:22.066648  1377 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:22.066664  1377 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:22.071107  1377 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:41049,127.25.254.253:42377
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.253
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.253:42377
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/info.pb
--webserver_interface=127.25.254.253
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1377
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
W20260504 14:08:22.071367  1377 master_options.cc:55] Only 2 masters are specified by master_addresses_flag ('127.25.254.254:41049,127.25.254.253:42377'), but minimum of 3 are required to tolerate failures of any one master. It is recommended to use at least 3 masters.
I20260504 14:08:22.072291  1377 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:22.073215  1377 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:22.079308  1383 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.079309  1385 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.079308  1382 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:22.079969  1377 server_base.cc:1061] running on GCE node
I20260504 14:08:22.080499  1377 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:22.081498  1377 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:22.082711  1377 hybrid_clock.cc:648] HybridClock initialized: now 1777903702082698 us; error 29 us; skew 500 ppm
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.253@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:22.085909  1377 init.cc:377] Logged in from keytab as oryx/127.25.254.253@KRBTEST.COM (short username oryx)
I20260504 14:08:22.087200  1377 webserver.cc:492] Webserver started at http://127.25.254.253:40579/ using document root <none> and password file <none>
I20260504 14:08:22.087777  1377 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.087859  1377 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.088080  1377 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:22.089769  1377 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/instance:
uuid: "5f6047ff1b18475e986323ecedb0f8b5"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "80409fd7103efcc75a24193595464447"
server_key_iv: "d6ff432579d33739888bbda93ab345f9"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.090346  1377 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal/instance:
uuid: "5f6047ff1b18475e986323ecedb0f8b5"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "80409fd7103efcc75a24193595464447"
server_key_iv: "d6ff432579d33739888bbda93ab345f9"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.093801  1377 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.001s
I20260504 14:08:22.096184  1392 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.097226  1377 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:22.097369  1377 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
uuid: "5f6047ff1b18475e986323ecedb0f8b5"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "80409fd7103efcc75a24193595464447"
server_key_iv: "d6ff432579d33739888bbda93ab345f9"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.097496  1377 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.139503  1377 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:22.142939  1377 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:22.143165  1377 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:22.152194  1377 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.253:42377
I20260504 14:08:22.152204  1444 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.253:42377 every 8 connection(s)
I20260504 14:08:22.153391  1377 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/info.pb
I20260504 14:08:22.156380  1445 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:22.160079  1445 sys_catalog.cc:422] member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } has no permanent_uuid. Determining permanent_uuid...
I20260504 14:08:22.162237 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1377
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.253@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903700, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
I20260504 14:08:22.173064   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.161853 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51708 (local address 127.25.254.254:41049)
0504 14:08:22.162069 (+   216us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.162076 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.162882 (+   806us) server_negotiation.cc:408] Connection header received
0504 14:08:22.163778 (+   896us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.163783 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.163846 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.163958 (+   112us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.164851 (+   893us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.165363 (+   512us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.166009 (+   646us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.166224 (+   215us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.168662 (+  2438us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.168684 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.168687 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.168716 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.170610 (+  1894us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.171219 (+   609us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.171222 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.171223 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.171272 (+    49us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.171629 (+   357us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.171633 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.171635 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.171822 (+   187us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.171924 (+   102us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.172656 (+   732us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.172817 (+   161us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":93}
I20260504 14:08:22.173487  1447 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.162106 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51708)
0504 14:08:22.162698 (+   592us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.162742 (+    44us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.163572 (+   830us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.164008 (+   436us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.164019 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.164394 (+   375us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.164702 (+   308us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.164712 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.165474 (+   762us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.165480 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.165888 (+   408us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.165894 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.166115 (+   221us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.166888 (+   773us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.166914 (+    26us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.168496 (+  1582us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.170737 (+  2241us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.170749 (+    12us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.170760 (+    11us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.171080 (+   320us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.171381 (+   301us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.171384 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.171386 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.171504 (+   118us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.171937 (+   433us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.171943 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.172237 (+   294us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.172464 (+   227us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.172702 (+   238us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":395,"thread_start_us":174,"threads_started":1}
I20260504 14:08:22.173861  1449 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.163215 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42730 (local address 127.25.254.253:42377)
0504 14:08:22.163634 (+   419us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.163640 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.163670 (+    30us) server_negotiation.cc:408] Connection header received
0504 14:08:22.163786 (+   116us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.163790 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.163950 (+   160us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.164122 (+   172us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.165095 (+   973us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.165889 (+   794us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.166790 (+   901us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.167008 (+   218us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.169373 (+  2365us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.169399 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.169407 (+     8us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.169428 (+    21us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.171694 (+  2266us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.172225 (+   531us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.172231 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.172232 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.172280 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.172591 (+   311us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.172596 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.172598 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.172853 (+   255us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.173058 (+   205us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.173572 (+   514us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.173705 (+   133us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":273,"thread_start_us":157,"threads_started":1}
I20260504 14:08:22.175295  1445 sys_catalog.cc:422] member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } has no permanent_uuid. Determining permanent_uuid...
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.253@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
I20260504 14:08:22.183187  1447 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.175852 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42732)
0504 14:08:22.176078 (+   226us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.176090 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.176164 (+    74us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.176407 (+   243us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.176410 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.176550 (+   140us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.176739 (+   189us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.176744 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.177434 (+   690us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.177438 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.177834 (+   396us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.177850 (+    16us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.177965 (+   115us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.178562 (+   597us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.178576 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.179989 (+  1413us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.181644 (+  1655us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.181648 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.181650 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.181829 (+   179us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.182239 (+   410us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.182242 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.182244 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.182292 (+    48us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.182725 (+   433us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.182728 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.182798 (+    70us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.182894 (+    96us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.183012 (+   118us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":165}
I20260504 14:08:22.183194  1449 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.175971 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42732 (local address 127.25.254.253:42377)
0504 14:08:22.176084 (+   113us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.176088 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.176145 (+    57us) server_negotiation.cc:408] Connection header received
0504 14:08:22.176249 (+   104us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.176252 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.176289 (+    37us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.176354 (+    65us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.176854 (+   500us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.177312 (+   458us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.177971 (+   659us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.178132 (+   161us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.180123 (+  1991us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.180138 (+    15us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.180140 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.180161 (+    21us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.181521 (+  1360us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.181957 (+   436us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.181960 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.181961 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.181998 (+    37us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.182380 (+   382us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.182383 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.182384 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.182519 (+   135us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.182598 (+    79us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.182974 (+   376us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.183066 (+    92us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":50}
I20260504 14:08:22.186451  1445 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Bootstrap starting.
I20260504 14:08:22.189016  1445 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:22.189862  1445 log.cc:826] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:22.192121  1445 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: No bootstrap required, opened a new log
I20260504 14:08:22.194762  1445 raft_consensus.cc:359] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } }
I20260504 14:08:22.194973  1445 raft_consensus.cc:385] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:22.195070  1445 raft_consensus.cc:740] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5f6047ff1b18475e986323ecedb0f8b5, State: Initialized, Role: FOLLOWER
I20260504 14:08:22.195492  1445 consensus_queue.cc:260] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } }
I20260504 14:08:22.196003  1451 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } } }
I20260504 14:08:22.196115  1451 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:22.196681  1445 sys_catalog.cc:565] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: configured and running, proceeding with master startup.
W20260504 14:08:22.201566  1462 catalog_manager.cc:1568] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:08:22.201668  1462 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:08:22.203940  1377 master_runner.cc:428] Detected that this master 5f6047ff1b18475e986323ecedb0f8b5 is joining an existing cluster
I20260504 14:08:22.204030  1377 master_runner.cc:432] Initiating AddMaster RPC to add 127.25.254.253:42377
I20260504 14:08:22.213454   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.206217 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51722 (local address 127.25.254.254:41049)
0504 14:08:22.206354 (+   137us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.206358 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.206612 (+   254us) server_negotiation.cc:408] Connection header received
0504 14:08:22.206818 (+   206us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.206822 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.206887 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.206944 (+    57us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.207644 (+   700us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.208190 (+   546us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.209105 (+   915us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.209313 (+   208us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.210404 (+  1091us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.210422 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.210424 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.210450 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.211947 (+  1497us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.212407 (+   460us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.212409 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.212411 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.212455 (+    44us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.212702 (+   247us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.212704 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.212705 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.212853 (+   148us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.212949 (+    96us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.213237 (+   288us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.213327 (+    90us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":37}
I20260504 14:08:22.213536  1468 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.206093 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51722)
0504 14:08:22.206505 (+   412us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.206522 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.206696 (+   174us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.207075 (+   379us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.207077 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.207266 (+   189us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.207501 (+   235us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.207506 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.208328 (+   822us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.208332 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.208974 (+   642us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.208983 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.209104 (+   121us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.209780 (+   676us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.209821 (+    41us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.210279 (+   458us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.212070 (+  1791us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.212075 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.212077 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.212308 (+   231us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.212545 (+   237us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.212549 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.212552 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.212612 (+    60us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.212978 (+   366us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.212982 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.213073 (+    91us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.213188 (+   115us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.213322 (+   134us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":313,"thread_start_us":285,"threads_started":1}
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.254@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
I20260504 14:08:22.230792  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.217995 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42738)
0504 14:08:22.218398 (+   403us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.218430 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.218547 (+   117us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.219162 (+   615us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.219166 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.219534 (+   368us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.219760 (+   226us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.219768 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.220892 (+  1124us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.220895 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.221250 (+   355us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.221256 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.221351 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.221965 (+   614us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.221980 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.223623 (+  1643us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.228935 (+  5312us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.228948 (+    13us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.228954 (+     6us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.229311 (+   357us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.229593 (+   282us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.229596 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.229598 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.229658 (+    60us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.230105 (+   447us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.230111 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.230279 (+   168us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.230431 (+   152us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.230595 (+   164us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":245,"thread_start_us":111,"threads_started":1}
I20260504 14:08:22.231527  1471 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.218066 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42738 (local address 127.25.254.253:42377)
0504 14:08:22.218794 (+   728us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.218801 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.218816 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:08:22.218912 (+    96us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.218916 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.218982 (+    66us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.219079 (+    97us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.219892 (+   813us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.220742 (+   850us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.221358 (+   616us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.221572 (+   214us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.223771 (+  2199us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.223803 (+    32us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.223805 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.223829 (+    24us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.228771 (+  4942us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.229438 (+   667us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.229442 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.229444 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.229495 (+    51us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.229752 (+   257us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.229755 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.229758 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.229948 (+   190us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.230040 (+    92us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.231227 (+  1187us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.231348 (+   121us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":627,"thread_start_us":156,"threads_started":1}
I20260504 14:08:22.231971  1469 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.206205 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42736)
0504 14:08:22.206635 (+   430us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.206648 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.206738 (+    90us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.206967 (+   229us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.206970 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.207720 (+   750us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.208021 (+   301us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.208029 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.208737 (+   708us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.208746 (+     9us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.209160 (+   414us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.209168 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.209540 (+   372us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.226969 (+ 17429us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.226988 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.227498 (+   510us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.229989 (+  2491us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.229995 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.230001 (+     6us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.230275 (+   274us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.230551 (+   276us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.230558 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.230562 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.230620 (+    58us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.231416 (+   796us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.231423 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.231514 (+    91us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.231644 (+   130us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.231777 (+   133us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":338,"thread_start_us":132,"threads_started":1}
I20260504 14:08:22.232532  1449 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.206638 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42736 (local address 127.25.254.253:42377)
0504 14:08:22.206742 (+   104us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.206748 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.206762 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:22.206805 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.206808 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.206848 (+    40us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.206927 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.208152 (+  1225us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.208613 (+   461us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.209278 (+   665us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.209440 (+   162us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.227644 (+ 18204us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.227664 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.227669 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.227704 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.229847 (+  2143us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.230391 (+   544us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.230397 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.230401 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.230453 (+    52us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.230725 (+   272us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.230729 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.230732 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.230943 (+   211us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.231038 (+    95us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.232255 (+  1217us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.232395 (+   140us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":39}
I20260504 14:08:22.233381   905 catalog_manager.cc:7158] Initiating ChangeConfig request to add master 127.25.254.253:42377: tablet_id: "00000000000000000000000000000000"
type: ADD_PEER
server {
  permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5"
  member_type: NON_VOTER
  last_known_addr {
    host: "127.25.254.253"
    port: 42377
  }
  attrs {
    promote: true
  }
}
dest_uuid: "8fc8681ad972459a8b49398127c6b42c"
cas_config_opid_index: -1
I20260504 14:08:22.234225   905 consensus_queue.cc:237] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 1.4, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } }
I20260504 14:08:22.236757  1419 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 0 FOLLOWER]: Advancing to term 1
WARNING: no policy specified for oryx/127.25.254.252@KRBTEST.COM; defaulting to no policy
I20260504 14:08:22.238008  1419 raft_consensus.cc:1275] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20260504 14:08:22.236173  1475 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 LEADER]: Committing config change with OpId 1.5: config changed from index -1 to 5, NON_VOTER 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253) added. New config: { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.238781  1474 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:22.241030  1475 catalog_manager.cc:7162] Successfully completed master ChangeConfig request to add master 127.25.254.253:42377
I20260504 14:08:22.241463  1377 master.cc:562] Master@127.25.254.253:42377 shutting down...
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.252@KRBTEST.COM" created.
I20260504 14:08:22.244159  1474 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Config change replication complete. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.244292  1474 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:22.246775  1419 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Committing config change with OpId 1.5: config changed from index -1 to 5, NON_VOTER 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253) added. New config: { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.249512  1476 mvcc.cc:204] Tried to move back new op lower bound from 7282293559835873280 to 7282293559706292224. Current Snapshot: MvccSnapshot[applied={T|T < 7282293559835873280}]
I20260504 14:08:22.248975  1451 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 8fc8681ad972459a8b49398127c6b42c. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.250025  1451 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.250133  1451 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.250237  1451 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.250317  1451 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.250377  1451 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.253142  1377 raft_consensus.cc:2243] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Raft consensus shutting down.
I20260504 14:08:22.253504  1377 raft_consensus.cc:2272] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Raft consensus is shut down!
I20260504 14:08:22.253600  1377 tablet_replica.cc:333] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: stopping tablet replica
I20260504 14:08:22.259863  1377 master.cc:584] Master@127.25.254.253:42377 shutdown complete.
I20260504 14:08:22.259968  1377 master_runner.cc:305] Clearing existing system tablet
I20260504 14:08:22.260222  1377 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.260296  1377 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.261420  1377 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
I20260504 14:08:22.262352  1486 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.262557  1377 fs_report.cc:389] FS layout report
--------------------
wal directory: 
metadata directory: 
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.262713  1377 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:22.262758  1377 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
uuid: "5f6047ff1b18475e986323ecedb0f8b5"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "80409fd7103efcc75a24193595464447"
server_key_iv: "d6ff432579d33739888bbda93ab345f9"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.264333  1377 ts_tablet_manager.cc:1916] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Deleting tablet data with delete state TABLET_DATA_DELETED
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.252 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.252 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:22.266337  1377 ts_tablet_manager.cc:1929] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId unknown
I20260504 14:08:22.266434  1377 log.cc:1199] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal/wals/00000000000000000000000000000000
I20260504 14:08:22.266757  1377 ts_tablet_manager.cc:1950] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Deleting consensus metadata
I20260504 14:08:22.267066  1377 master_runner.cc:315] Copying system tablet from 127.25.254.254:41049
I20260504 14:08:22.268370  1377 tablet_copy_client.cc:323] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: tablet copy: Beginning tablet copy session from remote peer at address 127.25.254.254:41049
WARNING: no policy specified for HTTP/127.25.254.252@KRBTEST.COM; defaulting to no policy
I20260504 14:08:22.277571   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.269187 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51730 (local address 127.25.254.254:41049)
0504 14:08:22.269330 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.269335 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.269562 (+   227us) server_negotiation.cc:408] Connection header received
0504 14:08:22.269792 (+   230us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.269796 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.269890 (+    94us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.269978 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.270717 (+   739us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.271230 (+   513us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.272052 (+   822us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.272252 (+   200us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.273636 (+  1384us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.273660 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.273664 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.273700 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.275750 (+  2050us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.276351 (+   601us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.276356 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.276358 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.276418 (+    60us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.276701 (+   283us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.276704 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.276706 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.276874 (+   168us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.277027 (+   153us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.277293 (+   266us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.277402 (+   109us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:22.277595  1493 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.269128 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51730)
0504 14:08:22.269458 (+   330us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.269470 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.269661 (+   191us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.270110 (+   449us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.270113 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.270378 (+   265us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.270579 (+   201us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.270585 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.271351 (+   766us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.271356 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.271897 (+   541us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.271906 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.272401 (+   495us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.273016 (+   615us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.273038 (+    22us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.273473 (+   435us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.275888 (+  2415us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.275895 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.275897 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.276218 (+   321us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.276521 (+   303us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.276526 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.276528 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.276586 (+    58us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.277023 (+   437us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.277027 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.277124 (+    97us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.277237 (+   113us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.277393 (+   156us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":264,"thread_start_us":105,"threads_started":1}
I20260504 14:08:22.278270   925 tablet_copy_service.cc:140] P 8fc8681ad972459a8b49398127c6b42c: Received BeginTabletCopySession request for tablet 00000000000000000000000000000000 from peer 5f6047ff1b18475e986323ecedb0f8b5 ({username='oryx', principal='oryx/127.25.254.253@KRBTEST.COM'} at 127.0.0.1:51730)
I20260504 14:08:22.278458   925 tablet_copy_service.cc:161] P 8fc8681ad972459a8b49398127c6b42c: Beginning new tablet copy session on tablet 00000000000000000000000000000000 from peer 5f6047ff1b18475e986323ecedb0f8b5 at {username='oryx', principal='oryx/127.25.254.253@KRBTEST.COM'} at 127.0.0.1:51730: session id = 5f6047ff1b18475e986323ecedb0f8b5-00000000000000000000000000000000
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.252@KRBTEST.COM" created.
I20260504 14:08:22.279843   925 tablet_copy_source_session.cc:215] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Tablet Copy: opened 0 blocks and 1 log segments
I20260504 14:08:22.281911  1377 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total
I20260504 14:08:22.284339  1377 tablet_copy_client.cc:806] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: tablet copy: Starting download of 0 data blocks...
I20260504 14:08:22.284541  1377 tablet_copy_client.cc:670] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: tablet copy: Starting download of 1 WAL segments...
I20260504 14:08:22.295279  1377 tablet_copy_client.cc:538] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260504 14:08:22.299350   925 tablet_copy_service.cc:342] P 8fc8681ad972459a8b49398127c6b42c: Request end of tablet copy session 5f6047ff1b18475e986323ecedb0f8b5-00000000000000000000000000000000 received from {username='oryx', principal='oryx/127.25.254.253@KRBTEST.COM'} at 127.0.0.1:51730
I20260504 14:08:22.299525   925 tablet_copy_service.cc:434] P 8fc8681ad972459a8b49398127c6b42c: ending tablet copy session 5f6047ff1b18475e986323ecedb0f8b5-00000000000000000000000000000000 on tablet 00000000000000000000000000000000 with peer 5f6047ff1b18475e986323ecedb0f8b5
W20260504 14:08:22.305171  1377 builtin_ntp.cc:688] could not shut down socket: Network error: shutdown error: Transport endpoint is not connected (error 107)
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.252 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.252 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:22.307904  1377 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260504 14:08:22.307925 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.252:44857
--webserver_interface=127.25.254.252
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.252
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:22.309711  1495 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.309831  1498 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.310034  1496 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:22.316020  1377 server_base.cc:1061] running on GCE node
I20260504 14:08:22.316267  1377 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:22.316640  1377 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:22.317807  1377 hybrid_clock.cc:648] HybridClock initialized: now 1777903702317783 us; error 58 us; skew 500 ppm
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.253@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:22.319978  1377 init.cc:377] Logged in from keytab as oryx/127.25.254.253@KRBTEST.COM (short username oryx)
I20260504 14:08:22.320629  1377 webserver.cc:492] Webserver started at http://127.25.254.253:46287/ using document root <none> and password file <none>
I20260504 14:08:22.320890  1377 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.320942  1377 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.322643  1377 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
I20260504 14:08:22.323947  1506 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.324270  1377 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20260504 14:08:22.324357  1377 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
uuid: "5f6047ff1b18475e986323ecedb0f8b5"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "80409fd7103efcc75a24193595464447"
server_key_iv: "d6ff432579d33739888bbda93ab345f9"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.324465  1377 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.338711  1377 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:22.340873  1377 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:22.345818  1377 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.253:42377
I20260504 14:08:22.346354  1558 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.253:42377 every 8 connection(s)
I20260504 14:08:22.346505  1377 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-1/data/info.pb
I20260504 14:08:22.348285  1559 sys_catalog.cc:263] Verifying existing consensus state
I20260504 14:08:22.349268  1559 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Bootstrap starting.
I20260504 14:08:22.356630  1559 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Bootstrap replayed 1/1 log segments. Stats: ops{read=5 overwritten=0 applied=5 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260504 14:08:22.356990  1559 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: Bootstrap complete.
I20260504 14:08:22.357587  1559 raft_consensus.cc:359] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } }
I20260504 14:08:22.357721  1559 raft_consensus.cc:740] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Becoming Follower/Learner. State: Replica: 5f6047ff1b18475e986323ecedb0f8b5, State: Initialized, Role: LEARNER
I20260504 14:08:22.357827  1559 consensus_queue.cc:260] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 5, Last appended: 1.5, Last appended by leader: 5, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } }
I20260504 14:08:22.358363  1559 sys_catalog.cc:565] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:22.358652  1562 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 committed_config { opid_index: 5 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } }
I20260504 14:08:22.358842  1562 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.361418  1573 catalog_manager.cc:1269] Loaded cluster ID: c0935c8e513349598af8d1e724bba87c
I20260504 14:08:22.361495  1573 catalog_manager.cc:1562] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: loading cluster ID for follower catalog manager: success
I20260504 14:08:22.363770  1573 catalog_manager.cc:1584] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: acquiring CA information for follower catalog manager: success
I20260504 14:08:22.365087  1573 catalog_manager.cc:1612] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5: importing token verification keys for follower catalog manager: success; most recent TSK sequence number 0
I20260504 14:08:22.400588  1098 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
W20260504 14:08:22.436954  1500 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:22.437211  1500 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:22.437266  1500 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:22.440896  1500 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:22.440974  1500 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:22.440999  1500 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:22.441018  1500 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:22.441066  1500 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:22.445930  1500 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.252
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.252:44857
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/info.pb
--webserver_interface=127.25.254.252
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1500
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:22.447136  1500 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:22.448028  1500 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:22.453671  1581 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.453704  1579 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.453702  1578 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:22.454106  1500 server_base.cc:1061] running on GCE node
I20260504 14:08:22.454674  1500 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:22.455598  1500 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:22.456813  1500 hybrid_clock.cc:648] HybridClock initialized: now 1777903702456791 us; error 40 us; skew 500 ppm
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.252@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:22.460211  1500 init.cc:377] Logged in from keytab as oryx/127.25.254.252@KRBTEST.COM (short username oryx)
I20260504 14:08:22.461390  1500 webserver.cc:492] Webserver started at http://127.25.254.252:37605/ using document root <none> and password file <none>
I20260504 14:08:22.461987  1500 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.462064  1500 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.462327  1500 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:22.464046  1500 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/instance:
uuid: "b782998ffee94b86a80d60ed27ff59f1"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "1b5efa031b3d4d3aa4c0ae6bbfcc4fd8"
server_key_iv: "cebcfb659453f8b17d815c645dd674d2"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.464545  1500 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal/instance:
uuid: "b782998ffee94b86a80d60ed27ff59f1"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "1b5efa031b3d4d3aa4c0ae6bbfcc4fd8"
server_key_iv: "cebcfb659453f8b17d815c645dd674d2"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.468189  1500 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:22.470625  1588 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.471740  1500 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:22.471904  1500 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
uuid: "b782998ffee94b86a80d60ed27ff59f1"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "1b5efa031b3d4d3aa4c0ae6bbfcc4fd8"
server_key_iv: "cebcfb659453f8b17d815c645dd674d2"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.472024  1500 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.484068  1500 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:22.487284  1500 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:22.487607  1500 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:22.495965  1640 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.252:44857 every 8 connection(s)
I20260504 14:08:22.495961  1500 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.252:44857
I20260504 14:08:22.497223  1500 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/info.pb
I20260504 14:08:22.500434  1641 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:22.504804  1641 sys_catalog.cc:422] member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } has no permanent_uuid. Determining permanent_uuid...
I20260504 14:08:22.505443 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1500
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.252@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903700, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:22.517539   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.507030 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51736 (local address 127.25.254.254:41049)
0504 14:08:22.507175 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.507178 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.507555 (+   377us) server_negotiation.cc:408] Connection header received
0504 14:08:22.508577 (+  1022us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.508587 (+    10us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.508715 (+   128us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.508801 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.509953 (+  1152us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.510528 (+   575us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.511176 (+   648us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.511369 (+   193us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.513825 (+  2456us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.513846 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.513848 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.513877 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.515382 (+  1505us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.515951 (+   569us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.515955 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.515957 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.516063 (+   106us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.516420 (+   357us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.516424 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.516426 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.516585 (+   159us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.516726 (+   141us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.517239 (+   513us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.517366 (+   127us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":50}
I20260504 14:08:22.518450  1643 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.506921 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35144 (local address 127.25.254.252:44857)
0504 14:08:22.507744 (+   823us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.507751 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.507783 (+    32us) server_negotiation.cc:408] Connection header received
0504 14:08:22.508305 (+   522us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.508332 (+    27us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.508580 (+   248us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.508916 (+   336us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.509775 (+   859us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.510594 (+   819us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.511255 (+   661us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.511462 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.514243 (+  2781us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.514272 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.514287 (+    15us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.514315 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.516246 (+  1931us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.516713 (+   467us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.516724 (+    11us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.516726 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.516770 (+    44us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.517051 (+   281us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.517055 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.517057 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.517235 (+   178us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.517427 (+   192us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.517628 (+   201us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.517767 (+   139us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":711,"thread_start_us":173,"threads_started":1}
I20260504 14:08:22.518450  1644 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.506921 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51736)
0504 14:08:22.507420 (+   499us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.507453 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.508305 (+   852us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.508910 (+   605us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.508915 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.509400 (+   485us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.509785 (+   385us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.509799 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.510634 (+   835us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.510640 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.511061 (+   421us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.511068 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.511257 (+   189us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.511821 (+   564us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.511846 (+    25us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.513598 (+  1752us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.515504 (+  1906us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.515513 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.515522 (+     9us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.515827 (+   305us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.516193 (+   366us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.516197 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.516201 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.516335 (+   134us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.516725 (+   390us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.516731 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.516979 (+   248us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.517191 (+   212us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.517429 (+   238us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":337,"spinlock_wait_cycles":75520,"thread_start_us":173,"threads_started":1}
I20260504 14:08:22.520527  1641 sys_catalog.cc:422] member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } has no permanent_uuid. Determining permanent_uuid...
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.252@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
I20260504 14:08:22.529592  1644 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.521056 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42752)
0504 14:08:22.521190 (+   134us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.521204 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.521296 (+    92us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.521639 (+   343us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.521643 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.522039 (+   396us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.522314 (+   275us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.522320 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.523429 (+  1109us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.523435 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.523818 (+   383us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.523825 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.523951 (+   126us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.524553 (+   602us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.524568 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.525878 (+  1310us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.528200 (+  2322us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.528203 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.528205 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.528381 (+   176us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.528676 (+   295us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.528679 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.528680 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.528726 (+    46us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.529150 (+   424us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.529153 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.529233 (+    80us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.529331 (+    98us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.529444 (+   113us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":55}
I20260504 14:08:22.529702  1645 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.521153 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42752 (local address 127.25.254.253:42377)
0504 14:08:22.521436 (+   283us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.521439 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.521451 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:22.521485 (+    34us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.521488 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.521529 (+    41us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.521627 (+    98us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.522523 (+   896us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.523301 (+   778us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.523949 (+   648us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.524158 (+   209us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.526032 (+  1874us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.526051 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.526054 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.526080 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.528050 (+  1970us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.528525 (+   475us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.528529 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.528530 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.528580 (+    50us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.528856 (+   276us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.528859 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.528861 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.529018 (+   157us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.529115 (+    97us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.529421 (+   306us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.529537 (+   116us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":206,"thread_start_us":93,"threads_started":1}
I20260504 14:08:22.530649  1641 sys_catalog.cc:422] member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } has no permanent_uuid. Determining permanent_uuid...
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.252@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:22.538683  1644 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.531330 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35158)
0504 14:08:22.531471 (+   141us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.531488 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.531558 (+    70us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.531854 (+   296us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.531857 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.532006 (+   149us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.532176 (+   170us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.532181 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.533005 (+   824us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.533008 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.533360 (+   352us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.533366 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.533486 (+   120us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.534085 (+   599us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.534098 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.535429 (+  1331us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.537140 (+  1711us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.537144 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.537146 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.537301 (+   155us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.537521 (+   220us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.537523 (+     2us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.537525 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.537569 (+    44us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.538239 (+   670us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.538241 (+     2us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.538317 (+    76us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.538409 (+    92us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.538505 (+    96us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":54}
I20260504 14:08:22.538854  1643 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.531417 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35158 (local address 127.25.254.252:44857)
0504 14:08:22.531536 (+   119us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.531539 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.531551 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:22.531681 (+   130us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.531684 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.531724 (+    40us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.531797 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.532308 (+   511us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.532863 (+   555us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.533546 (+   683us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.533852 (+   306us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.535572 (+  1720us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.535594 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.535596 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.535617 (+    21us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.537031 (+  1414us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.537388 (+   357us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.537391 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.537392 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.537428 (+    36us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.537759 (+   331us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.537763 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.537764 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.538082 (+   318us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.538251 (+   169us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.538510 (+   259us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.538603 (+    93us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":55}
I20260504 14:08:22.541566  1641 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Bootstrap starting.
I20260504 14:08:22.544147  1641 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:22.544955  1641 log.cc:826] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:22.547005  1641 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: No bootstrap required, opened a new log
I20260504 14:08:22.549572  1641 raft_consensus.cc:359] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } }
I20260504 14:08:22.549739  1641 raft_consensus.cc:385] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:22.549778  1641 raft_consensus.cc:740] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b782998ffee94b86a80d60ed27ff59f1, State: Initialized, Role: FOLLOWER
I20260504 14:08:22.550247  1641 consensus_queue.cc:260] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } }
I20260504 14:08:22.550832  1647 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } } }
I20260504 14:08:22.550983  1647 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:22.551451  1641 sys_catalog.cc:565] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: configured and running, proceeding with master startup.
W20260504 14:08:22.556695  1658 catalog_manager.cc:1568] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260504 14:08:22.556803  1658 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260504 14:08:22.560302  1500 master_runner.cc:428] Detected that this master b782998ffee94b86a80d60ed27ff59f1 is joining an existing cluster
I20260504 14:08:22.560422  1500 master_runner.cc:432] Initiating AddMaster RPC to add 127.25.254.252:44857
I20260504 14:08:22.576225  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.573231 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51746)
0504 14:08:22.573378 (+   147us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.573399 (+    21us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.573476 (+    77us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.574132 (+   656us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.574135 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.574177 (+    42us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:22.574359 (+   182us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.574364 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.575127 (+   763us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.575131 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.575750 (+   619us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.575761 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.575880 (+   119us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.575934 (+    54us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.576006 (+    72us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.576080 (+    74us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":58}
I20260504 14:08:22.576876  1667 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.573576 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51746 (local address 127.25.254.254:41049)
0504 14:08:22.573889 (+   313us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.573926 (+    37us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.573937 (+    11us) server_negotiation.cc:408] Connection header received
0504 14:08:22.573981 (+    44us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.573983 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.574019 (+    36us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.574116 (+    97us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:22.574470 (+   354us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.575007 (+   537us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.575902 (+   895us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.576542 (+   640us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.576606 (+    64us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.576662 (+    56us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.576729 (+    67us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":258,"thread_start_us":157,"threads_started":1}
I20260504 14:08:22.582352  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.578251 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42772)
0504 14:08:22.578678 (+   427us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.578691 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.578802 (+   111us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.579146 (+   344us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.579149 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.579174 (+    25us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:22.579393 (+   219us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.579400 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580493 (+  1093us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.580496 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.581939 (+  1443us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.581949 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.582036 (+    87us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.582049 (+    13us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.582091 (+    42us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.582193 (+   102us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":288}
I20260504 14:08:22.584961  1668 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.578357 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42772 (local address 127.25.254.253:42377)
0504 14:08:22.578623 (+   266us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.578628 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.578757 (+   129us) server_negotiation.cc:408] Connection header received
0504 14:08:22.578919 (+   162us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.578923 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.578994 (+    71us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.579503 (+   509us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:22.579601 (+    98us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580370 (+   769us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.583186 (+  2816us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.583953 (+   767us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.584391 (+   438us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.584460 (+    69us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.584535 (+    75us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":194,"thread_start_us":112,"threads_started":1}
I20260504 14:08:22.586081  1666 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.563318 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35160)
0504 14:08:22.563622 (+   304us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.563637 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.563717 (+    80us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.564021 (+   304us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.564024 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.578416 (+ 14392us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.579324 (+   908us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.579336 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580331 (+   995us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.580335 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.580911 (+   576us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.580919 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.581031 (+   112us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.581778 (+   747us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.581798 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.582395 (+   597us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.584659 (+  2264us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.584663 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.584665 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.584889 (+   224us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.585121 (+   232us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.585124 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.585126 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.585171 (+    45us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.585548 (+   377us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.585551 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.585623 (+    72us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.585742 (+   119us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.585882 (+   140us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":211,"thread_start_us":115,"threads_started":1}
I20260504 14:08:22.586633  1643 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.563622 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35160 (local address 127.25.254.252:44857)
0504 14:08:22.563734 (+   112us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.563740 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.563754 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:22.563789 (+    35us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.563792 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.563832 (+    40us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.563925 (+    93us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.579481 (+ 15556us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580199 (+   718us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.581278 (+  1079us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.581458 (+   180us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.582538 (+  1080us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.582557 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.582561 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.582586 (+    25us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.584529 (+  1943us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.584990 (+   461us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.584994 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.584996 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.585035 (+    39us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.585253 (+   218us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.585256 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.585259 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.585424 (+   165us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.585514 (+    90us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.586331 (+   817us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.586447 (+   116us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":46,"spinlock_wait_cycles":15616}
I20260504 14:08:22.587219   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.564126 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51740 (local address 127.25.254.254:41049)
0504 14:08:22.564269 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.564273 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.564285 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:22.564322 (+    37us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.564324 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.564364 (+    40us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.564458 (+    94us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.579509 (+ 15051us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580270 (+   761us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.581171 (+   901us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.581351 (+   180us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.583802 (+  2451us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.583820 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.583824 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.583851 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.585473 (+  1622us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.585936 (+   463us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.585940 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.585942 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.585996 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.586292 (+   296us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.586295 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.586297 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.586458 (+   161us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.586536 (+    78us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.586953 (+   417us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.587061 (+   108us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:22.587718  1664 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.562835 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51740)
0504 14:08:22.563181 (+   346us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.563196 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.563292 (+    96us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.564472 (+  1180us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.564476 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.579080 (+ 14604us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.579366 (+   286us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.579373 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.580419 (+  1046us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.580425 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.581037 (+   612us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.581045 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.581474 (+   429us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.582291 (+   817us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.582306 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.583668 (+  1362us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.585599 (+  1931us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.585602 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.585604 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.585811 (+   207us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.586106 (+   295us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.586110 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.586112 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.586189 (+    77us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.586645 (+   456us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.586651 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.586749 (+    98us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.587394 (+   645us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.587530 (+   136us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":242,"thread_start_us":136,"threads_started":1}
I20260504 14:08:22.597162  1645 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.562988 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42756 (local address 127.25.254.253:42377)
0504 14:08:22.563665 (+   677us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.563669 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.563684 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:08:22.563727 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.563730 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.563787 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.563873 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.582734 (+ 18861us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.583520 (+   786us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.588884 (+  5364us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.589103 (+   219us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.591418 (+  2315us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.591439 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.591442 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.591469 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.594055 (+  2586us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.594494 (+   439us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.594501 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.594503 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.594553 (+    50us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.594826 (+   273us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.594829 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.594830 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.595020 (+   190us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.595122 (+   102us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.596807 (+  1685us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.596955 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":586}
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903701, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.254@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:22.597699  1665 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.562901 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42756)
0504 14:08:22.563296 (+   395us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.563310 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.563398 (+    88us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.563899 (+   501us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.563904 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.581830 (+ 17926us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.582094 (+   264us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.582101 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.584203 (+  2102us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.584208 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.588565 (+  4357us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.588575 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.589241 (+   666us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.589961 (+   720us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.589979 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.591275 (+  1296us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.594197 (+  2922us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.594202 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.594204 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.594385 (+   181us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.594667 (+   282us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.594671 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.594675 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.594724 (+    49us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.595880 (+  1156us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.595885 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.596029 (+   144us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.597365 (+  1336us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.597516 (+   151us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":320,"thread_start_us":115,"threads_started":1}
I20260504 14:08:22.601512  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.592361 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35170)
0504 14:08:22.592843 (+   482us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.592869 (+    26us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.592973 (+   104us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.593313 (+   340us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.593319 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.593514 (+   195us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.593765 (+   251us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.593775 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.595172 (+  1397us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.595175 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.595545 (+   370us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.595550 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.595642 (+    92us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.596130 (+   488us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.596146 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.597699 (+  1553us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.599998 (+  2299us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.600002 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.600004 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.600183 (+   179us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.600460 (+   277us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.600463 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.600465 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.600522 (+    57us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.601061 (+   539us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.601067 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.601166 (+    99us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.601268 (+   102us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.601387 (+   119us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":398}
I20260504 14:08:22.601594  1643 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.592662 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35170 (local address 127.25.254.252:44857)
0504 14:08:22.592807 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.592811 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.592945 (+   134us) server_negotiation.cc:408] Connection header received
0504 14:08:22.593114 (+   169us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.593118 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.593174 (+    56us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.593255 (+    81us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.593949 (+   694us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.594703 (+   754us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.595790 (+  1087us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.596002 (+   212us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.597857 (+  1855us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.597877 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.597880 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.597908 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.599861 (+  1953us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.600287 (+   426us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.600291 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.600293 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.600341 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.600631 (+   290us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.600634 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.600636 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.600810 (+   174us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.600918 (+   108us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.601338 (+   420us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.601446 (+   108us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":56}
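The traces above all follow the same message shape: a NEGOTIATE round trip that picks the authn type, two TLS_HANDSHAKE round trips (TLSv1.3), then, for authn=SASL only, a GSSAPI exchange (SASL_INITIATE, then SASL_RESPONSE per SASL_CHALLENGE until SASL_SUCCESS), and finally the connection context. A minimal sketch of that ordering, using hypothetical names (this is an illustration of the sequence seen in the log, not Kudu's actual negotiation code):

```python
def expected_client_phases(authn, num_challenges=2):
    """Message sequence a client trace is expected to show, per this log.

    authn is "CERTIFICATE" or "SASL"; num_challenges is how many
    SASL_CHALLENGE/SASL_RESPONSE round trips the server issued (the GSSAPI
    exchanges above show two).
    """
    phases = ["NEGOTIATE", "TLS_HANDSHAKE", "TLS_HANDSHAKE"]
    if authn == "SASL":
        phases.append("SASL_INITIATE")
        phases.extend(["SASL_RESPONSE"] * num_challenges)
    phases.append("CONNECTION_CONTEXT")
    return phases


# The CERTIFICATE-authn traces (e.g. 127.0.0.1:51746 -> 127.25.254.254:41049)
# skip SASL entirely; the SASL ones add the GSSAPI round trips.
assert expected_client_phases("CERTIFICATE") == [
    "NEGOTIATE", "TLS_HANDSHAKE", "TLS_HANDSHAKE", "CONNECTION_CONTEXT",
]
```

Note the two SASL-authn connections each stall ~14-18ms between "Received NEGOTIATE response" and "Negotiated authn=SASL", which lines up with the KDC TGS_REQ issued at 14:08:22 for oryx/127.25.254.252.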
I20260504 14:08:22.603212   904 catalog_manager.cc:7158] Initiating ChangeConfig request to add master 127.25.254.252:44857: tablet_id: "00000000000000000000000000000000"
type: ADD_PEER
server {
  permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1"
  member_type: NON_VOTER
  last_known_addr {
    host: "127.25.254.252"
    port: 44857
  }
  attrs {
    promote: true
  }
}
dest_uuid: "8fc8681ad972459a8b49398127c6b42c"
cas_config_opid_index: 5
I20260504 14:08:22.603991   904 consensus_queue.cc:237] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5, Committed index: 5, Last appended: 1.5, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } }
I20260504 14:08:22.605162  1474 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 LEADER]: Committing config change with OpId 1.6: config changed from index 5 to 6, NON_VOTER b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252) added. New config: { opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.605733  1533 raft_consensus.cc:1275] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 1 index: 5. Preceding OpId from leader: term: 1 index: 6. (index mismatch)
I20260504 14:08:22.606444  1615 raft_consensus.cc:3060] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:22.606942  1474 catalog_manager.cc:7162] Successfully completed master ChangeConfig request to add master 127.25.254.252:44857
I20260504 14:08:22.607122  1474 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6, Last known committed idx: 5, Time since last communication: 0.000s
I20260504 14:08:22.607461  1474 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Config change replication complete. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.607563  1474 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:22.607975  1615 raft_consensus.cc:1275] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 6. (index mismatch)
I20260504 14:08:22.608320  1647 raft_consensus.cc:493] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:22.608294  1533 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Committing config change with OpId 1.6: config changed from index 5 to 6, NON_VOTER b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252) added. New config: { opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.608454  1647 raft_consensus.cc:515] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } }
I20260504 14:08:22.608853  1500 master.cc:562] Master@127.25.254.252:44857 shutting down...
I20260504 14:08:22.609431  1647 leader_election.cc:290] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049), 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253:42377)
I20260504 14:08:22.609891  1562 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 8fc8681ad972459a8b49398127c6b42c. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.610009  1562 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.610126  1562 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 6 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: NON_VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: true } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.610085  1532 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "00000000000000000000000000000000" candidate_uuid: "b782998ffee94b86a80d60ed27ff59f1" candidate_term: 2 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5f6047ff1b18475e986323ecedb0f8b5" is_pre_election: true
I20260504 14:08:22.610239  1562 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.610948   915 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "00000000000000000000000000000000" candidate_uuid: "b782998ffee94b86a80d60ed27ff59f1" candidate_term: 2 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8fc8681ad972459a8b49398127c6b42c" is_pre_election: true
I20260504 14:08:22.611307  1475 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:22.611449  1591 leader_election.cc:304] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b782998ffee94b86a80d60ed27ff59f1; no voters: 5f6047ff1b18475e986323ecedb0f8b5, 8fc8681ad972459a8b49398127c6b42c
I20260504 14:08:22.611940  1647 raft_consensus.cc:2749] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260504 14:08:22.613075  1475 raft_consensus.cc:1064] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: attempting to promote NON_VOTER 5f6047ff1b18475e986323ecedb0f8b5 to VOTER
I20260504 14:08:22.613507  1475 consensus_queue.cc:237] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6, Committed index: 6, Last appended: 1.6, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } }
W20260504 14:08:22.613654   891 proxy.cc:239] Call had error, refreshing address and retrying: Remote error: Service unavailable: service kudu.consensus.ConsensusService not registered on Master
I20260504 14:08:22.614833  1533 raft_consensus.cc:1275] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 LEARNER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 1 index: 6. Preceding OpId from leader: term: 1 index: 7. (index mismatch)
I20260504 14:08:22.615229  1669 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 7, Last known committed idx: 6, Time since last communication: 0.000s
I20260504 14:08:22.615343  1500 raft_consensus.cc:2243] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:22.615577  1500 raft_consensus.cc:2272] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:22.615675  1500 tablet_replica.cc:333] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: stopping tablet replica
W20260504 14:08:22.615823   891 consensus_peers.cc:597] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c -> Peer b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252:44857): Couldn't send request to peer b782998ffee94b86a80d60ed27ff59f1. Status: Remote error: Service unavailable: service kudu.consensus.ConsensusService not registered on Master. This is attempt 1: this message will repeat every 5th retry.
I20260504 14:08:22.617030  1474 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 LEADER]: Committing config change with OpId 1.7: config changed from index 6 to 7, 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253) changed from NON_VOTER to VOTER. New config: { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.617444  1533 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Committing config change with OpId 1.7: config changed from index 6 to 7, 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253) changed from NON_VOTER to VOTER. New config: { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.618808  1475 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Config change replication complete. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.618921  1475 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:22.619062  1671 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.619163  1671 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:22.619316  1475 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.619395  1475 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:22.624225  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.623944 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35182)
0504 14:08:22.624092 (+   148us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.624148 (+    56us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":51}
W20260504 14:08:22.624456   905 master.cc:680] Network error: unable to get registration information for peer b782998ffee94b86a80d60ed27ff59f1: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
I20260504 14:08:22.631919  1500 master.cc:584] Master@127.25.254.252:44857 shutdown complete.
I20260504 14:08:22.632002  1500 master_runner.cc:305] Clearing existing system tablet
I20260504 14:08:22.632213  1500 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.632288  1500 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.633559  1500 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:08:22.634797  1675 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.635032  1500 fs_report.cc:389] FS layout report
--------------------
wal directory: 
metadata directory: 
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.635190  1500 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:22.635233  1500 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
uuid: "b782998ffee94b86a80d60ed27ff59f1"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "1b5efa031b3d4d3aa4c0ae6bbfcc4fd8"
server_key_iv: "cebcfb659453f8b17d815c645dd674d2"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.636955  1500 ts_tablet_manager.cc:1916] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:22.638833  1500 ts_tablet_manager.cc:1929] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId unknown
I20260504 14:08:22.638919  1500 log.cc:1199] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal/wals/00000000000000000000000000000000
I20260504 14:08:22.639175  1500 ts_tablet_manager.cc:1950] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Deleting consensus metadata
I20260504 14:08:22.639364  1500 master_runner.cc:315] Copying system tablet from 127.25.254.254:41049
I20260504 14:08:22.640558  1500 tablet_copy_client.cc:323] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: tablet copy: Beginning tablet copy session from remote peer at address 127.25.254.254:41049
I20260504 14:08:22.645363  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.645038 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35192)
0504 14:08:22.645210 (+   172us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.645269 (+    59us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":60}
W20260504 14:08:22.645614   905 master.cc:680] Network error: unable to get registration information for peer b782998ffee94b86a80d60ed27ff59f1: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
I20260504 14:08:22.648743   944 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.641248 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51748 (local address 127.25.254.254:41049)
0504 14:08:22.641391 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.641394 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.641488 (+    94us) server_negotiation.cc:408] Connection header received
0504 14:08:22.641631 (+   143us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.641634 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.641683 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.641756 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:22.642806 (+  1050us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.643504 (+   698us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.644369 (+   865us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.644608 (+   239us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.645659 (+  1051us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:22.645675 (+    16us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:22.645679 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:22.645712 (+    33us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:22.647274 (+  1562us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.647692 (+   418us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.647696 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.647700 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.647745 (+    45us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:22.647990 (+   245us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:22.647994 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:22.647996 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:22.648139 (+   143us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:22.648243 (+   104us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.648496 (+   253us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.648595 (+    99us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:08:22.648763  1681 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.641084 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51748)
0504 14:08:22.641387 (+   303us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.641403 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.641535 (+   132us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.641799 (+   264us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.641803 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.642072 (+   269us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:22.642665 (+   593us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.642675 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.643656 (+   981us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.643659 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.644235 (+   576us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.644244 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.644375 (+   131us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.645060 (+   685us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:22.645099 (+    39us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:22.645538 (+   439us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:22.647402 (+  1864us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.647406 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.647408 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.647584 (+   176us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.647847 (+   263us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:22.647850 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:22.647852 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:22.647898 (+    46us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:22.648233 (+   335us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:22.648236 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:22.648321 (+    85us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.648432 (+   111us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.648573 (+   141us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":204,"thread_start_us":127,"threads_started":1}
I20260504 14:08:22.649439   925 tablet_copy_service.cc:140] P 8fc8681ad972459a8b49398127c6b42c: Received BeginTabletCopySession request for tablet 00000000000000000000000000000000 from peer b782998ffee94b86a80d60ed27ff59f1 ({username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:51748)
I20260504 14:08:22.649518   925 tablet_copy_service.cc:161] P 8fc8681ad972459a8b49398127c6b42c: Beginning new tablet copy session on tablet 00000000000000000000000000000000 from peer b782998ffee94b86a80d60ed27ff59f1 at {username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:51748: session id = b782998ffee94b86a80d60ed27ff59f1-00000000000000000000000000000000
I20260504 14:08:22.649853  1234 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
I20260504 14:08:22.650750   925 tablet_copy_source_session.cc:215] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Tablet Copy: opened 0 blocks and 1 log segments
I20260504 14:08:22.652019  1500 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total
I20260504 14:08:22.654021  1500 tablet_copy_client.cc:806] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: tablet copy: Starting download of 0 data blocks...
I20260504 14:08:22.654240  1500 tablet_copy_client.cc:670] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: tablet copy: Starting download of 1 WAL segments...
I20260504 14:08:22.655958  1500 tablet_copy_client.cc:538] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260504 14:08:22.657753   925 tablet_copy_service.cc:342] P 8fc8681ad972459a8b49398127c6b42c: Request end of tablet copy session b782998ffee94b86a80d60ed27ff59f1-00000000000000000000000000000000 received from {username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:51748
I20260504 14:08:22.657837   925 tablet_copy_service.cc:434] P 8fc8681ad972459a8b49398127c6b42c: ending tablet copy session b782998ffee94b86a80d60ed27ff59f1-00000000000000000000000000000000 on tablet 00000000000000000000000000000000 with peer b782998ffee94b86a80d60ed27ff59f1
W20260504 14:08:22.662806  1500 builtin_ntp.cc:688] could not shutdown socket: Network error: shutdown error: Transport endpoint is not connected (error 107)
I20260504 14:08:22.664912  1500 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:22.666599  1682 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.666927  1685 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:22.666937  1683 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:22.667271  1500 server_base.cc:1061] running on GCE node
I20260504 14:08:22.667440  1500 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:22.667737  1500 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:22.668885  1500 hybrid_clock.cc:648] HybridClock initialized: now 1777903702668869 us; error 36 us; skew 500 ppm
May 04 14:08:22 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903702, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.252@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:22.670607  1500 init.cc:377] Logged in from keytab as oryx/127.25.254.252@KRBTEST.COM (short username oryx)
I20260504 14:08:22.671319  1500 webserver.cc:492] Webserver started at http://127.25.254.252:40545/ using document root <none> and password file <none>
I20260504 14:08:22.671525  1500 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:22.671561  1500 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:22.672770  1500 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:22.673732  1692 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:22.673975  1500 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:22.674042  1500 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
uuid: "b782998ffee94b86a80d60ed27ff59f1"
format_stamp: "Formatted at 2026-05-04 14:08:22 on dist-test-slave-2x32"
server_key: "1b5efa031b3d4d3aa4c0ae6bbfcc4fd8"
server_key_iv: "cebcfb659453f8b17d815c645dd674d2"
server_key_version: "encryptionkey@0"
I20260504 14:08:22.674139  1500 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:22.681702  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.681394 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35206)
0504 14:08:22.681543 (+   149us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.681604 (+    61us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":55}
W20260504 14:08:22.682040   905 master.cc:680] Network error: unable to get registration information for peer b782998ffee94b86a80d60ed27ff59f1: Client connection negotiation failed: client connection to 127.25.254.252:44857: connect: Connection refused (error 111)
I20260504 14:08:22.687083  1500 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:22.689186  1500 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:22.694732  1500 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.252:44857
I20260504 14:08:22.695027  1744 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.252:44857 every 8 connection(s)
I20260504 14:08:22.695446  1500 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/master-2/data/info.pb
I20260504 14:08:22.696911  1745 sys_catalog.cc:263] Verifying existing consensus state
I20260504 14:08:22.697762  1745 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Bootstrap starting.
I20260504 14:08:22.707086  1745 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Bootstrap replayed 1/1 log segments. Stats: ops{read=7 overwritten=0 applied=7 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260504 14:08:22.707422  1745 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Bootstrap complete.
I20260504 14:08:22.708038  1745 raft_consensus.cc:359] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } }
I20260504 14:08:22.708184  1745 raft_consensus.cc:740] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 LEARNER]: Becoming Follower/Learner. State: Replica: b782998ffee94b86a80d60ed27ff59f1, State: Initialized, Role: LEARNER
I20260504 14:08:22.708345  1745 consensus_queue.cc:260] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7, Last appended: 1.7, Last appended by leader: 7, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } }
I20260504 14:08:22.708811  1748 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:22.708920  1748 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:22.708930  1745 sys_catalog.cc:565] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:22.712157  1759 catalog_manager.cc:1269] Loaded cluster ID: c0935c8e513349598af8d1e724bba87c
I20260504 14:08:22.712214  1759 catalog_manager.cc:1562] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: loading cluster ID for follower catalog manager: success
I20260504 14:08:22.714080  1759 catalog_manager.cc:1584] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: acquiring CA information for follower catalog manager: success
I20260504 14:08:22.715190  1759 catalog_manager.cc:1612] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: importing token verification keys for follower catalog manager: success; most recent TSK sequence number 0
I20260504 14:08:22.753815  1470 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.750419 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35222)
0504 14:08:22.750571 (+   152us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:22.750587 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:22.750670 (+    83us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:22.751154 (+   484us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:22.751157 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:22.751173 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:22.751415 (+   242us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.751426 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.752452 (+  1026us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.752456 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:22.753421 (+   965us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:22.753430 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.753544 (+   114us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.753570 (+    26us) client_negotiation.cc:770] Sending connection context
0504 14:08:22.753618 (+    48us) client_negotiation.cc:241] Negotiation successful
0504 14:08:22.753672 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":58}
I20260504 14:08:22.754546  1760 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:22.750516 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35222 (local address 127.25.254.252:44857)
0504 14:08:22.750903 (+   387us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:22.750907 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:22.750923 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:22.750977 (+    54us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:22.750980 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:22.751030 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:22.751125 (+    95us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:22.751579 (+   454us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.752335 (+   756us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:22.753568 (+  1233us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:22.754078 (+   510us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:22.754250 (+   172us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:22.754313 (+    63us) server_negotiation.cc:300] Negotiation successful
0504 14:08:22.754418 (+   105us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":255,"thread_start_us":128,"threads_started":1}
I20260504 14:08:22.896165  1370 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
I20260504 14:08:23.007131  1719 raft_consensus.cc:1217] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 LEARNER]: Deduplicated request from leader. Original: 1.6->[1.7-1.7]   Dedup: 1.7->[]
I20260504 14:08:23.007551  1748 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 8fc8681ad972459a8b49398127c6b42c. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:23.007696  1748 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: LEARNER
I20260504 14:08:23.397612  1474 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 7 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: NON_VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: true } } }
I20260504 14:08:23.397817  1474 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:23.398826  1762 raft_consensus.cc:1064] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: attempting to promote NON_VOTER b782998ffee94b86a80d60ed27ff59f1 to VOTER
I20260504 14:08:23.399242  1762 consensus_queue.cc:237] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7, Committed index: 7, Last appended: 1.7, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } }
I20260504 14:08:23.400446  1719 raft_consensus.cc:1275] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 LEARNER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 1 index: 7. Preceding OpId from leader: term: 1 index: 8. (index mismatch)
I20260504 14:08:23.400451  1533 raft_consensus.cc:1275] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Refusing update from remote peer 8fc8681ad972459a8b49398127c6b42c: Log matching property violated. Preceding OpId in replica: term: 1 index: 7. Preceding OpId from leader: term: 1 index: 8. (index mismatch)
I20260504 14:08:23.400911  1762 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 8, Last known committed idx: 7, Time since last communication: 0.000s
I20260504 14:08:23.401237  1761 consensus_queue.cc:1048] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [LEADER]: Connected to new peer: Peer: permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 8, Last known committed idx: 7, Time since last communication: 0.000s
I20260504 14:08:23.403499  1761 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [term 1 LEADER]: Committing config change with OpId 1.8: config changed from index 7 to 8, b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252) changed from NON_VOTER to VOTER. New config: { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.404904  1762 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Config change replication complete. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.405031  1762 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:23.405053  1474 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.405133  1474 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:23.404825  1719 raft_consensus.cc:2955] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Committing config change with OpId 1.8: config changed from index 7 to 8, b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252) changed from NON_VOTER to VOTER. New config: { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.404922  1761 sys_catalog.cc:455] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: SysCatalogTable state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.405416  1761 sys_catalog.cc:458] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:23.405469  1533 raft_consensus.cc:2955] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Committing config change with OpId 1.8: config changed from index 7 to 8, b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252) changed from NON_VOTER to VOTER. New config: { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.406430  1748 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.406540  1748 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:23.406723  1766 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "8fc8681ad972459a8b49398127c6b42c" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:23.406849  1766 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:23.681975  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.667797 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42778 (local address 127.25.254.253:42377)
0504 14:08:23.668090 (+   293us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.668093 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.668108 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:08:23.668146 (+    38us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.668149 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.668193 (+    44us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.668283 (+    90us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:23.669169 (+   886us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.669695 (+   526us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.670370 (+   675us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.670513 (+   143us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.677549 (+  7036us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:23.677574 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:23.677577 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:23.677653 (+    76us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:23.679832 (+  2179us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.680469 (+   637us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.680475 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.680478 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.680550 (+    72us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.680965 (+   415us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.680968 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.680970 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.681139 (+   169us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:23.681257 (+   118us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.681546 (+   289us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.681797 (+   251us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":215,"thread_start_us":86,"threads_started":1}
I20260504 14:08:23.685914  1769 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.683069 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51750)
0504 14:08:23.683394 (+   325us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.683412 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.683520 (+   108us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.683805 (+   285us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.683808 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.683827 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.684046 (+   219us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.684052 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.684967 (+   915us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.684970 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.685552 (+   582us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.685558 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.685665 (+   107us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.685680 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.685718 (+    38us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.685768 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":217,"thread_start_us":90,"threads_started":1}
I20260504 14:08:23.686388  1770 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.683224 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51750 (local address 127.25.254.254:41049)
0504 14:08:23.683557 (+   333us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.683561 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.683573 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:23.683626 (+    53us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.683629 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.683675 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.683761 (+    86us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.684190 (+   429us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.684814 (+   624us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.685697 (+   883us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.686110 (+   413us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.686146 (+    36us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.686223 (+    77us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.686288 (+    65us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":247,"thread_start_us":130,"threads_started":1}
I20260504 14:08:23.690637  1769 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.687885 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42790)
0504 14:08:23.688059 (+   174us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.688079 (+    20us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.688167 (+    88us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.688413 (+   246us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.688416 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.688440 (+    24us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.688597 (+   157us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.688602 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.689313 (+   711us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.689317 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.690063 (+   746us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.690073 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.690304 (+   231us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.690322 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.690385 (+    63us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.690445 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":65,"spinlock_wait_cycles":896}
I20260504 14:08:23.691092  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.688014 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42790 (local address 127.25.254.253:42377)
0504 14:08:23.688147 (+   133us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.688151 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.688163 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:23.688283 (+   120us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.688288 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.688326 (+    38us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.688406 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.688720 (+   314us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.689195 (+   475us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.690285 (+  1090us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.690856 (+   571us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.690899 (+    43us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.690950 (+    51us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.691013 (+    63us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":63}
I20260504 14:08:23.695379  1769 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.692499 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35230)
0504 14:08:23.692659 (+   160us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.692673 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.692743 (+    70us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.693246 (+   503us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.693249 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.693267 (+    18us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.693408 (+   141us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.693412 (+     4us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.694374 (+   962us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.694378 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.695048 (+   670us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.695056 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.695167 (+   111us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.695182 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.695236 (+    54us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.695283 (+    47us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":69}
I20260504 14:08:23.695914  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.692698 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35230 (local address 127.25.254.252:44857)
0504 14:08:23.693000 (+   302us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.693004 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.693016 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:23.693061 (+    45us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.693063 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.693114 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.693194 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.693556 (+   362us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.694231 (+   675us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.695189 (+   958us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.695669 (+   480us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.695702 (+    33us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.695751 (+    49us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.695801 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":236,"thread_start_us":160,"threads_started":1}
I20260504 14:08:23.705480  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.697923 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35236 (local address 127.25.254.252:44857)
0504 14:08:23.698047 (+   124us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.698051 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.698093 (+    42us) server_negotiation.cc:408] Connection header received
0504 14:08:23.698281 (+   188us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.698285 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.698334 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.698423 (+    89us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:23.699196 (+   773us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.699945 (+   749us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.700605 (+   660us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.700788 (+   183us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.702323 (+  1535us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:23.702341 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:23.702345 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:23.702368 (+    23us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:23.704027 (+  1659us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.704428 (+   401us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.704431 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.704432 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.704470 (+    38us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.704726 (+   256us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.704728 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.704729 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.704857 (+   128us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:23.704962 (+   105us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.705264 (+   302us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.705359 (+    95us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:08:23.709363  1772 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.706671 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:51754)
0504 14:08:23.706943 (+   272us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.706955 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.707031 (+    76us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.707364 (+   333us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.707367 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.707386 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.707570 (+   184us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.707574 (+     4us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.708381 (+   807us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.708384 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.709055 (+   671us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.709063 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.709164 (+   101us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.709177 (+    13us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.709215 (+    38us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.709260 (+    45us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":199,"thread_start_us":96,"threads_started":1}
I20260504 14:08:23.710136  1770 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.706778 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51754 (local address 127.25.254.254:41049)
0504 14:08:23.706970 (+   192us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.706980 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.707007 (+    27us) server_negotiation.cc:408] Connection header received
0504 14:08:23.707185 (+   178us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.707189 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.707238 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.707318 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.707694 (+   376us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.708262 (+   568us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.709194 (+   932us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.709798 (+   604us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.709942 (+   144us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.709991 (+    49us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.710039 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":85}
I20260504 14:08:23.714373  1772 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.711743 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.0.0.1:42804)
0504 14:08:23.711961 (+   218us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.711977 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.712072 (+    95us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.712350 (+   278us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.712353 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.712371 (+    18us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.712526 (+   155us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.712530 (+     4us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.713389 (+   859us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.713392 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.714005 (+   613us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.714013 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.714109 (+    96us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.714120 (+    11us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.714226 (+   106us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.714283 (+    57us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":123}
I20260504 14:08:23.714912  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.711975 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42804 (local address 127.25.254.253:42377)
0504 14:08:23.712137 (+   162us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.712140 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.712154 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:23.712202 (+    48us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.712205 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.712248 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.712328 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.712703 (+   375us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.713242 (+   539us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.714139 (+   897us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.714699 (+   560us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.714731 (+    32us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.714775 (+    44us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.714817 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":70}
I20260504 14:08:23.718763  1772 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.716303 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.0.0.1:35248)
0504 14:08:23.716527 (+   224us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.716537 (+    10us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.716611 (+    74us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.716833 (+   222us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.716836 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.716850 (+    14us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:23.717007 (+   157us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.717011 (+     4us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.717729 (+   718us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.717732 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.718452 (+   720us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.718460 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.718567 (+   107us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.718579 (+    12us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.718622 (+    43us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.718667 (+    45us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":169}
I20260504 14:08:23.719282  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.716424 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35248 (local address 127.25.254.252:44857)
0504 14:08:23.716538 (+   114us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.716542 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.716578 (+    36us) server_negotiation.cc:408] Connection header received
0504 14:08:23.716691 (+   113us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.716694 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.716733 (+    39us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.716803 (+    70us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:23.717133 (+   330us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.717583 (+   450us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.718600 (+  1017us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.719105 (+   505us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.719132 (+    27us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.719172 (+    40us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.719207 (+    35us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":47}
I20260504 14:08:23.720822 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 969
I20260504 14:08:23.727356 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:46541
--local_ip_for_outbound_sockets=127.25.254.193
--tserver_master_addrs=127.25.254.254:41049
--webserver_port=42399
--webserver_interface=127.25.254.193
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:23.843116  1773 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:23.843371  1773 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:23.843490  1773 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:23.847174  1773 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:23.847251  1773 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:23.847379  1773 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:23.852301  1773 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:46541
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=42399
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1773
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:23.853605  1773 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:23.854714  1773 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:23.862946  1781 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:23.863191  1773 server_base.cc:1061] running on GCE node
W20260504 14:08:23.863125  1778 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:23.863131  1779 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:23.863808  1773 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:23.864490  1773 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:23.865700  1773 hybrid_clock.cc:648] HybridClock initialized: now 1777903703865672 us; error 44 us; skew 500 ppm
May 04 14:08:23 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903703, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:23.869470  1773 init.cc:377] Logged in from keytab as oryx/127.25.254.193@KRBTEST.COM (short username oryx)
I20260504 14:08:23.871243  1773 webserver.cc:492] Webserver started at http://127.25.254.193:42399/ using document root <none> and password file <none>
I20260504 14:08:23.871876  1773 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:23.872002  1773 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:23.876375  1773 fs_manager.cc:714] Time spent opening directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:23.878945  1788 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:23.880349  1773 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:23.880479  1773 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "3162d5c41c52485598abe7a0a0eec507"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "b30318173ec89571540a97b067bf3d12"
server_key_iv: "6108ebe4d0b67d73b48667b516f82339"
server_key_version: "encryptionkey@0"
I20260504 14:08:23.880931  1773 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:23.904103  1773 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:23.907354  1773 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:23.907567  1773 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:23.908155  1773 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:23.909178  1773 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:23.909253  1773 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:23.909323  1773 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:23.909376  1773 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:23.920115  1773 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:46541
I20260504 14:08:23.920231  1901 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:46541 every 8 connection(s)
I20260504 14:08:23.921218  1773 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:23.924718 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1773
I20260504 14:08:23.924952 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1105
May 04 14:08:23 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903703, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
May 04 14:08:23 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903703, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
May 04 14:08:23 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903703, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:23.933800 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:44781
--local_ip_for_outbound_sockets=127.25.254.194
--tserver_master_addrs=127.25.254.254:41049
--webserver_port=45723
--webserver_interface=127.25.254.194
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
I20260504 14:08:23.938848  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.925200 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:54373 (local address 127.25.254.252:44857)
0504 14:08:23.925366 (+   166us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.925370 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.925769 (+   399us) server_negotiation.cc:408] Connection header received
0504 14:08:23.925944 (+   175us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.925949 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.926002 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.926058 (+    56us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:23.928047 (+  1989us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928741 (+   694us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.929489 (+   748us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929654 (+   165us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.934463 (+  4809us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:23.934482 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:23.934485 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:23.934540 (+    55us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:23.936744 (+  2204us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937289 (+   545us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.937295 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.937300 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.937356 (+    56us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937721 (+   365us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.937727 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.937732 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.937923 (+   191us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:23.938003 (+    80us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.938458 (+   455us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.938599 (+   141us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":71}
I20260504 14:08:23.938944  1770 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.924559 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:41523 (local address 127.25.254.254:41049)
0504 14:08:23.924685 (+   126us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.924689 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.925111 (+   422us) server_negotiation.cc:408] Connection header received
0504 14:08:23.925524 (+   413us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.925530 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.925598 (+    68us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.925698 (+   100us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:23.927477 (+  1779us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928367 (+   890us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.929362 (+   995us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929595 (+   233us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.934193 (+  4598us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:23.934219 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:23.934225 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:23.934265 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:23.937199 (+  2934us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937646 (+   447us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.937653 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.937658 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.937786 (+   128us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.938054 (+   268us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.938057 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.938059 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.938250 (+   191us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:23.938327 (+    77us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.938645 (+   318us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.938793 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:08:23.939502  1906 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.923454 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.25.254.193:49907)
0504 14:08:23.923969 (+   515us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.924001 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.924975 (+   974us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.925774 (+   799us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.925788 (+    14us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.927221 (+  1433us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:23.927539 (+   318us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.927550 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928599 (+  1049us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.928602 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.929195 (+   593us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.929203 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929754 (+   551us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.930775 (+  1021us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:23.930817 (+    42us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:23.933739 (+  2922us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:23.936355 (+  2616us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.936364 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.936380 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.936771 (+   391us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.937105 (+   334us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937109 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.937114 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.937256 (+   142us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.937762 (+   506us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:23.937768 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:23.938119 (+   351us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.938381 (+   262us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.938645 (+   264us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":349,"spinlock_wait_cycles":26240,"thread_start_us":166,"threads_started":1}
I20260504 14:08:23.939509  1907 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.924392 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:41523)
0504 14:08:23.924973 (+   581us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.924988 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.925187 (+   199us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.925832 (+   645us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.925838 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.926607 (+   769us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:23.927286 (+   679us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.927305 (+    19us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928533 (+  1228us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.928539 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.929218 (+   679us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.929230 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929754 (+   524us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.930831 (+  1077us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:23.930854 (+    23us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:23.934009 (+  3155us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:23.937322 (+  3313us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937325 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.937328 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.937522 (+   194us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.937899 (+   377us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937902 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.937904 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.937955 (+    51us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.938394 (+   439us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:23.938398 (+     4us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:23.938466 (+    68us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.939083 (+   617us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.939208 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":359,"spinlock_wait_cycles":9472,"thread_start_us":139,"threads_started":1}
I20260504 14:08:23.940263  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.923144 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:49907 (local address 127.25.254.253:42377)
0504 14:08:23.923311 (+   167us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:23.923316 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:23.924110 (+   794us) server_negotiation.cc:408] Connection header received
0504 14:08:23.925211 (+  1101us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:23.925218 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:23.925283 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:23.925339 (+    56us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:23.927726 (+  2387us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928466 (+   740us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.929325 (+   859us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929521 (+   196us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.934016 (+  4495us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:23.934040 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:23.934044 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:23.934077 (+    33us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:23.936180 (+  2103us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.936911 (+   731us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.936915 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.936917 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.936991 (+    74us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937362 (+   371us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:23.937368 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:23.937373 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:23.937614 (+   241us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:23.937704 (+    90us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:23.940044 (+  2340us) server_negotiation.cc:300] Negotiation successful
0504 14:08:23.940153 (+   109us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":62}
I20260504 14:08:23.940989  1902 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:23.941210  1902 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:23.941215  1904 heartbeater.cc:344] Connected to a master server at 127.25.254.253:42377
I20260504 14:08:23.941327  1904 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:23.941759  1902 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
I20260504 14:08:23.941761  1904 heartbeater.cc:507] Master 127.25.254.253:42377 requested a full tablet report, sending...
I20260504 14:08:23.942866  1908 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:23.924835 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.25.254.193:54373)
0504 14:08:23.925698 (+   863us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:23.925711 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:23.925825 (+   114us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:23.926223 (+   398us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:23.926228 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:23.927608 (+  1380us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:23.927915 (+   307us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.927922 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.928862 (+   940us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:23.928865 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:23.929372 (+   507us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:23.929379 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:23.929754 (+   375us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:23.930810 (+  1056us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:23.930824 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:23.934262 (+  3438us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:23.936879 (+  2617us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.936883 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.936885 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.937162 (+   277us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.937469 (+   307us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:23.937474 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:23.937479 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:23.937540 (+    61us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:23.938077 (+   537us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:23.938083 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:23.938201 (+   118us) client_negotiation.cc:770] Sending connection context
0504 14:08:23.942290 (+  4089us) client_negotiation.cc:241] Negotiation successful
0504 14:08:23.942701 (+   411us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":800,"spinlock_wait_cycles":11520,"thread_start_us":94,"threads_started":1}
I20260504 14:08:23.943179   905 ts_manager.cc:194] Re-registered known tserver with Master: 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
I20260504 14:08:23.944061   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:41523
I20260504 14:08:23.945583  1523 ts_manager.cc:194] Registered new tserver with Master: 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
I20260504 14:08:23.946336  1903 heartbeater.cc:344] Connected to a master server at 127.25.254.252:44857
I20260504 14:08:23.946437  1903 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:23.946683  1903 heartbeater.cc:507] Master 127.25.254.252:44857 requested a full tablet report, sending...
I20260504 14:08:23.947921  1709 ts_manager.cc:194] Registered new tserver with Master: 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
W20260504 14:08:24.051040  1909 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:24.051287  1909 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:24.051390  1909 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:24.055015  1909 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:24.055090  1909 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:24.055223  1909 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:24.059955  1909 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:44781
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=45723
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.1909
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:24.061115  1909 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:24.061990  1909 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:24.068926  1915 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:24.068943  1918 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:24.069027  1909 server_base.cc:1061] running on GCE node
W20260504 14:08:24.068926  1916 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:24.069550  1909 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:24.070219  1909 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:24.071416  1909 hybrid_clock.cc:648] HybridClock initialized: now 1777903704071401 us; error 25 us; skew 500 ppm
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:24.074311  1909 init.cc:377] Logged in from keytab as oryx/127.25.254.194@KRBTEST.COM (short username oryx)
I20260504 14:08:24.075508  1909 webserver.cc:492] Webserver started at http://127.25.254.194:45723/ using document root <none> and password file <none>
I20260504 14:08:24.076107  1909 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:24.076198  1909 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:24.080585  1909 fs_manager.cc:714] Time spent opening directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:24.082885  1925 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.084151  1909 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:24.084270  1909 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "132a133de3102d7ab024268479044fb8"
server_key_iv: "da992cdf2236ad34ac26c572d1f7c401"
server_key_version: "encryptionkey@0"
I20260504 14:08:24.084769  1909 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:24.115484  1909 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:24.118487  1909 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:24.118672  1909 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:24.119243  1909 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:24.120213  1909 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:24.120266  1909 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.120311  1909 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:24.120327  1909 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.132117  1909 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:44781
I20260504 14:08:24.132134  2038 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:44781 every 8 connection(s)
I20260504 14:08:24.133210  1909 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
I20260504 14:08:24.142966 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 1909
I20260504 14:08:24.143148 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1241
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:24.147791  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.135118 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:46079 (local address 127.25.254.253:42377)
0504 14:08:24.135244 (+   126us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.135248 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.135951 (+   703us) server_negotiation.cc:408] Connection header received
0504 14:08:24.137048 (+  1097us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.137052 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.137115 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.137201 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.138675 (+  1474us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139431 (+   756us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.140217 (+   786us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.140418 (+   201us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.143187 (+  2769us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.143207 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.143213 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.143248 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.145335 (+  2087us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146000 (+   665us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.146004 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.146007 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.146062 (+    55us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146434 (+   372us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.146438 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.146440 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.146655 (+   215us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.146731 (+    76us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.147476 (+   745us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.147601 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":50}
I20260504 14:08:24.148001  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.135023 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:46579 (local address 127.25.254.252:44857)
0504 14:08:24.135169 (+   146us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.135174 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.135956 (+   782us) server_negotiation.cc:408] Connection header received
0504 14:08:24.137148 (+  1192us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.137155 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.137213 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.137289 (+    76us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.138853 (+  1564us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139337 (+   484us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.140441 (+  1104us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.141292 (+   851us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.143690 (+  2398us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.143708 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.143710 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.143763 (+    53us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.145981 (+  2218us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146435 (+   454us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.146438 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.146440 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.146483 (+    43us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146760 (+   277us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.146763 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.146765 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.146929 (+   164us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.147012 (+    83us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.147751 (+   739us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.147861 (+   110us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:24.148870  2044 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.135352 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.25.254.194:46079)
0504 14:08:24.135803 (+   451us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.135843 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.136859 (+  1016us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.137386 (+   527us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.137397 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.137999 (+   602us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.138503 (+   504us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.138511 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139529 (+  1018us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.139532 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.140080 (+   548us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.140087 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.140547 (+   460us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.141133 (+   586us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.141152 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.143037 (+  1885us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.145495 (+  2458us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.145503 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.145521 (+    18us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.145864 (+   343us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.146197 (+   333us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146200 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.146202 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.146322 (+   120us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.146864 (+   542us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.146870 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.147179 (+   309us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.148228 (+  1049us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.148330 (+   102us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":340,"mutex_wait_us":24,"spinlock_wait_cycles":80512,"thread_start_us":155,"threads_started":1}
I20260504 14:08:24.148890  2043 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.135352 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.25.254.194:46579)
0504 14:08:24.135803 (+   451us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.135843 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.136847 (+  1004us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.137386 (+   539us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.137397 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.138457 (+  1060us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.138737 (+   280us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.138743 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139463 (+   720us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.139467 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.140080 (+   613us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.140087 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.140213 (+   126us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.140903 (+   690us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.140924 (+    21us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.143539 (+  2615us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.146095 (+  2556us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146100 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.146105 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.146332 (+   227us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.146595 (+   263us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.146598 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.146600 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.146660 (+    60us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.147050 (+   390us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.147052 (+     2us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.147179 (+   127us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.147409 (+   230us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.147703 (+   294us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":276,"spinlock_wait_cycles":33280,"thread_start_us":106,"threads_started":1}
I20260504 14:08:24.150404  2040 heartbeater.cc:344] Connected to a master server at 127.25.254.252:44857
I20260504 14:08:24.150540  1770 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.136449 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:49341 (local address 127.25.254.254:41049)
0504 14:08:24.136582 (+   133us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.136587 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.136635 (+    48us) server_negotiation.cc:408] Connection header received
0504 14:08:24.137338 (+   703us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.137345 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.137412 (+    67us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.137508 (+    96us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.138625 (+  1117us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139441 (+   816us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.140110 (+   669us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.140296 (+   186us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.142936 (+  2640us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.142959 (+    23us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.142963 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.142998 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.147277 (+  4279us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.149142 (+  1865us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.149149 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.149154 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.149216 (+    62us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.149489 (+   273us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.149495 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.149500 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.149686 (+   186us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.149765 (+    79us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.150171 (+   406us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.150314 (+   143us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:08:24.150707  2040 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.151116  2045 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.136294 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:49341)
0504 14:08:24.136565 (+   271us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.136580 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.136861 (+   281us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.137529 (+   668us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.137532 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.137944 (+   412us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.138474 (+   530us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.138485 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.139599 (+  1114us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.139602 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.139984 (+   382us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.139990 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.140212 (+   222us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.140979 (+   767us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.141002 (+    23us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.142651 (+  1649us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.147420 (+  4769us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.147426 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.147432 (+     6us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.149009 (+  1577us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.149326 (+   317us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.149332 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.149337 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.149397 (+    60us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.149839 (+   442us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.149845 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.149948 (+   103us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.150696 (+   748us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.150840 (+   144us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":198,"spinlock_wait_cycles":4992,"thread_start_us":89,"threads_started":1}
I20260504 14:08:24.151292  2040 heartbeater.cc:507] Master 127.25.254.252:44857 requested a full tablet report, sending...
I20260504 14:08:24.151572  2041 heartbeater.cc:344] Connected to a master server at 127.25.254.253:42377
I20260504 14:08:24.151804  2041 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.152009  2041 heartbeater.cc:507] Master 127.25.254.253:42377 requested a full tablet report, sending...
I20260504 14:08:24.152055 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:37043
--local_ip_for_outbound_sockets=127.25.254.195
--tserver_master_addrs=127.25.254.254:41049
--webserver_port=36455
--webserver_interface=127.25.254.195
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:37991
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857 with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
I20260504 14:08:24.152509  2039 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:24.152518  1709 ts_manager.cc:194] Registered new tserver with Master: 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:24.152594  2039 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.152746  1523 ts_manager.cc:194] Registered new tserver with Master: 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:24.152815  2039 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
I20260504 14:08:24.153471   905 ts_manager.cc:194] Re-registered known tserver with Master: 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:24.154033   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:49341
W20260504 14:08:24.260152  2046 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:24.260406  2046 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:24.260468  2046 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:24.264070  2046 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:24.264153  2046 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:24.264240  2046 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:24.268941  2046 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:37991
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:37043
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=36455
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:41049,127.25.254.253:42377,127.25.254.252:44857
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2046
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:24.270015  2046 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:24.270952  2046 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:24.278837  2051 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:24.278795  2052 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:24.278755  2054 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:24.279281  2046 server_base.cc:1061] running on GCE node
I20260504 14:08:24.279767  2046 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:24.280498  2046 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:24.283880  2046 hybrid_clock.cc:648] HybridClock initialized: now 1777903704283835 us; error 56 us; skew 500 ppm
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:24.287432  2046 init.cc:377] Logged in from keytab as oryx/127.25.254.195@KRBTEST.COM (short username oryx)
I20260504 14:08:24.288654  2046 webserver.cc:492] Webserver started at http://127.25.254.195:36455/ using document root <none> and password file <none>
I20260504 14:08:24.289256  2046 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:24.289346  2046 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:24.293154  2046 fs_manager.cc:714] Time spent opening directory manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:24.295331  2061 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.296478  2046 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:24.296634  2046 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "e7715d2595ab45af9b8b972e34e95e10"
format_stamp: "Formatted at 2026-05-04 14:08:21 on dist-test-slave-2x32"
server_key: "d42f70ebf68a15e9a40c1dc4a50a1c61"
server_key_iv: "ab7bcfb17ab0e99ea446a35810d43aff"
server_key_version: "encryptionkey@0"
I20260504 14:08:24.297070  2046 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:24.328902  2046 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:24.332371  2046 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:24.332592  2046 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:24.333214  2046 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:24.334219  2046 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:24.334293  2046 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.334365  2046 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:24.334406  2046 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:24.344467  2046 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37043
I20260504 14:08:24.344504  2174 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37043 every 8 connection(s)
I20260504 14:08:24.345739  2046 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:24.349121 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 2046
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.253@KRBTEST.COM
May 04 14:08:24 dist-test-slave-2x32 krb5kdc[857](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903704, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.252@KRBTEST.COM
I20260504 14:08:24.365933  1768 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.347795 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:57377 (local address 127.25.254.253:42377)
0504 14:08:24.348057 (+   262us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.348062 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.348908 (+   846us) server_negotiation.cc:408] Connection header received
0504 14:08:24.349950 (+  1042us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.349954 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.350010 (+    56us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.350087 (+    77us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.353077 (+  2990us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.353827 (+   750us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.355425 (+  1598us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.355610 (+   185us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.358223 (+  2613us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.358243 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.358246 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.358274 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.362640 (+  4366us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364316 (+  1676us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.364322 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.364323 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.364376 (+    53us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364665 (+   289us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.364669 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.364671 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.364835 (+   164us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.364913 (+    78us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.365645 (+   732us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.365767 (+   122us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":174}
I20260504 14:08:24.366767  2180 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.348039 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.253:42377 (local address 127.25.254.195:57377)
0504 14:08:24.348768 (+   729us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.348810 (+    42us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.349740 (+   930us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.350434 (+   694us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.350445 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.351045 (+   600us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.351727 (+   682us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.351741 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.353972 (+  2231us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.353976 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.355291 (+  1315us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.355301 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.355738 (+   437us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.356228 (+   490us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.356243 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.357995 (+  1752us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.363937 (+  5942us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.363943 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.363948 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.364199 (+   251us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.364491 (+   292us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364494 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.364496 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.364549 (+    53us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.364997 (+   448us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.365004 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.365317 (+   313us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.366393 (+  1076us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.366516 (+   123us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":526,"mutex_wait_us":318,"spinlock_wait_cycles":14336,"thread_start_us":117,"threads_started":1}
I20260504 14:08:24.367184  1770 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.347926 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:59123 (local address 127.25.254.254:41049)
0504 14:08:24.348149 (+   223us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.348152 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.350280 (+  2128us) server_negotiation.cc:408] Connection header received
0504 14:08:24.350454 (+   174us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.350458 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.350509 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.350560 (+    51us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.351904 (+  1344us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.352720 (+   816us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.354451 (+  1731us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.354647 (+   196us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.358224 (+  3577us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.358243 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.358247 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.358274 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.359982 (+  1708us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364189 (+  4207us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.364194 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.364195 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.364254 (+    59us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364658 (+   404us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.364661 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.364663 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.364835 (+   172us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.364940 (+   105us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.366915 (+  1975us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.367041 (+   126us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":151}
I20260504 14:08:24.366767  2179 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.348039 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:59123)
0504 14:08:24.350163 (+  2124us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.350207 (+    44us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.350349 (+   142us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.350708 (+   359us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.350713 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.351508 (+   795us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.351741 (+   233us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.351747 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.353727 (+  1980us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.353730 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.354342 (+   612us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.354349 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.354561 (+   212us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.355196 (+   635us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.355223 (+    27us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.357995 (+  2772us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.363339 (+  5344us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.363350 (+    11us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.363368 (+    18us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.363794 (+   426us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.364360 (+   566us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.364364 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.364366 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.364556 (+   190us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.364997 (+   441us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.365004 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.365317 (+   313us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.365563 (+   246us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.365812 (+   249us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":299,"spinlock_wait_cycles":21632,"thread_start_us":121,"threads_started":1}
I20260504 14:08:24.367393  1771 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.349399 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:52851 (local address 127.25.254.252:44857)
0504 14:08:24.350316 (+   917us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.350321 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.353448 (+  3127us) server_negotiation.cc:408] Connection header received
0504 14:08:24.353595 (+   147us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.353599 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.353658 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.353735 (+    77us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.354485 (+   750us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.355175 (+   690us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.357823 (+  2648us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.358021 (+   198us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.360163 (+  2142us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.360183 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.360186 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.360214 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.363173 (+  2959us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.365233 (+  2060us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.365236 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.365238 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.365277 (+    39us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.365584 (+   307us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.365589 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.365594 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.365749 (+   155us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.365840 (+    91us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.367096 (+  1256us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.367244 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":818}
I20260504 14:08:24.367919  2181 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.349050 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.252:44857 (local address 127.25.254.195:52851)
0504 14:08:24.353355 (+  4305us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.353367 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.353489 (+   122us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.353830 (+   341us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.353833 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.354035 (+   202us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:24.354342 (+   307us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.354349 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.355316 (+   967us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.355320 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.355851 (+   531us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.355861 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.358132 (+  2271us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.358620 (+   488us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:24.358636 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:24.360035 (+  1399us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:24.363799 (+  3764us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.363803 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.363805 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.364870 (+  1065us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.365389 (+   519us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:24.365394 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:24.365399 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:24.365473 (+    74us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:24.365909 (+   436us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:24.365914 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:24.366010 (+    96us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.367576 (+  1566us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.367708 (+   132us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":4242,"thread_start_us":88,"threads_started":1}
I20260504 14:08:24.368623  2177 heartbeater.cc:344] Connected to a master server at 127.25.254.253:42377
I20260504 14:08:24.368906  2177 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.369505  2177 heartbeater.cc:507] Master 127.25.254.253:42377 requested a full tablet report, sending...
I20260504 14:08:24.369910  2189 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.352906 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51766 (local address 127.25.254.254:41049)
0504 14:08:24.354967 (+  2061us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.354971 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.356488 (+  1517us) server_negotiation.cc:408] Connection header received
0504 14:08:24.356536 (+    48us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.356539 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.356590 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.356743 (+   153us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.357898 (+  1155us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.358676 (+   778us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.360708 (+  2032us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.361053 (+   345us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.362066 (+  1013us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.362092 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.362097 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.362132 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.368047 (+  5915us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.368619 (+   572us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.368623 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.368625 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.368693 (+    68us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.369027 (+   334us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.369032 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.369033 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.369226 (+   193us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.369363 (+   137us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.369595 (+   232us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.369739 (+   144us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":1976,"thread_start_us":68,"threads_started":1}
I20260504 14:08:24.370509  2175 heartbeater.cc:344] Connected to a master server at 127.25.254.254:41049
I20260504 14:08:24.370591  2175 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.370684  2188 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.352698 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35264 (local address 127.25.254.252:44857)
0504 14:08:24.359798 (+  7100us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.359802 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.359814 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:24.359850 (+    36us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.359853 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.359900 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.361372 (+  1472us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.361488 (+   116us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.362739 (+  1251us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.363805 (+  1066us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.365071 (+  1266us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.365169 (+    98us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.365190 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.365196 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.365223 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.368715 (+  3492us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.369202 (+   487us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.369206 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.369208 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.369251 (+    43us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.369527 (+   276us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.369530 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.369533 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.369685 (+   152us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.370282 (+   597us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.370363 (+    81us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.370488 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":7001,"thread_start_us":70,"threads_started":1}
I20260504 14:08:24.370771  2175 heartbeater.cc:507] Master 127.25.254.254:41049 requested a full tablet report, sending...
I20260504 14:08:24.371816   905 ts_manager.cc:194] Re-registered known tserver with Master: e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195:37043)
I20260504 14:08:24.372339   905 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:59123
I20260504 14:08:24.372731  2176 heartbeater.cc:344] Connected to a master server at 127.25.254.252:44857
I20260504 14:08:24.372807  2176 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:24.372998  2176 heartbeater.cc:507] Master 127.25.254.252:44857 requested a full tablet report, sending...
I20260504 14:08:24.373554  1523 ts_manager.cc:194] Registered new tserver with Master: e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195:37043)
I20260504 14:08:24.373641  1709 ts_manager.cc:194] Registered new tserver with Master: e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195:37043)
I20260504 14:08:24.372881   905 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:51766:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260504 14:08:24.375571  2191 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.353042 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42816 (local address 127.25.254.253:42377)
0504 14:08:24.359433 (+  6391us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.359437 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.359450 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:24.359493 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.359498 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.359544 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.361231 (+  1687us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.362193 (+   962us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.362930 (+   737us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.364114 (+  1184us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.366351 (+  2237us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.366454 (+   103us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.366472 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.366477 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.366499 (+    22us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.370937 (+  4438us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.371392 (+   455us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.371396 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.371397 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.371445 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.371959 (+   514us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.371964 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.371967 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.374274 (+  2307us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.374425 (+   151us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.375026 (+   601us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.375157 (+   131us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":6289,"thread_start_us":62,"threads_started":1}
W20260504 14:08:24.375680   905 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:24.392419  2202 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.388732 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:46541 (local address 127.0.0.1:50226)
0504 14:08:24.389318 (+   586us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.389331 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.389408 (+    77us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.389917 (+   509us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.389920 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.389963 (+    43us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:24.390261 (+   298us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.390270 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.391400 (+  1130us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.391405 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.392006 (+   601us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.392012 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.392143 (+   131us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.392161 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.392212 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.392274 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":487,"thread_start_us":126,"threads_started":1}
I20260504 14:08:24.393536  2201 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.388338 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37043 (local address 127.0.0.1:37348)
0504 14:08:24.389049 (+   711us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.389066 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.389162 (+    96us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.390716 (+  1554us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.390719 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.390739 (+    20us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:24.390982 (+   243us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.390988 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.392330 (+  1342us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.392334 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.393225 (+   891us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.393234 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.393314 (+    80us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.393329 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.393367 (+    38us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.393413 (+    46us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":613,"thread_start_us":133,"threads_started":1}
I20260504 14:08:24.393572  2203 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.389180 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50226 (local address 127.25.254.193:46541)
0504 14:08:24.389494 (+   314us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.389501 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.389529 (+    28us) server_negotiation.cc:408] Connection header received
0504 14:08:24.389602 (+    73us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.389606 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.389763 (+   157us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.389921 (+   158us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:24.390399 (+   478us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.391291 (+   892us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.392129 (+   838us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.393252 (+  1123us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.393345 (+    93us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.393400 (+    55us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.393457 (+    57us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":190,"thread_start_us":86,"threads_started":1}
I20260504 14:08:24.394007  2199 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.389064 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44781 (local address 127.0.0.1:35388)
0504 14:08:24.390298 (+  1234us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.390312 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.390401 (+    89us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.390971 (+   570us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.390973 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.390989 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:24.391235 (+   246us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.391242 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.392690 (+  1448us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.392694 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.393636 (+   942us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.393645 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.393756 (+   111us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.393776 (+    20us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.393831 (+    55us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.393892 (+    61us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":1169,"thread_start_us":95,"threads_started":1}
I20260504 14:08:24.394814  2200 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.388479 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37348 (local address 127.25.254.195:37043)
0504 14:08:24.390269 (+  1790us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.390275 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.390297 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:08:24.390389 (+    92us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.390394 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.390567 (+   173us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.390715 (+   148us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:24.391117 (+   402us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.392198 (+  1081us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.393849 (+  1651us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.394425 (+   576us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.394531 (+   106us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.394595 (+    64us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.394685 (+    90us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":1635,"thread_start_us":121,"threads_started":1}
I20260504 14:08:24.395184  2204 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.389482 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35388 (local address 127.25.254.194:44781)
0504 14:08:24.389791 (+   309us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.389797 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.390491 (+   694us) server_negotiation.cc:408] Connection header received
0504 14:08:24.390572 (+    81us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.390578 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.390718 (+   140us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.390836 (+   118us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:24.391363 (+   527us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.392435 (+  1072us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.394253 (+  1818us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.394751 (+   498us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.394837 (+    86us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.394954 (+   117us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.395037 (+    83us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":209,"thread_start_us":132,"threads_started":1}
I20260504 14:08:24.396492  1832 tablet_service.cc:1511] Processing CreateTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 (DEFAULT_TABLE table=test-table [id=433d8bb1742b468f9649b3f3ac44ed73]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:24.397564  1832 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbeff15a5ecd47b6a7d4130705c064f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:24.397462  2109 tablet_service.cc:1511] Processing CreateTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 (DEFAULT_TABLE table=test-table [id=433d8bb1742b468f9649b3f3ac44ed73]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:24.397900  1973 tablet_service.cc:1511] Processing CreateTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 (DEFAULT_TABLE table=test-table [id=433d8bb1742b468f9649b3f3ac44ed73]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:24.398442  2109 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbeff15a5ecd47b6a7d4130705c064f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:24.398772  1973 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbeff15a5ecd47b6a7d4130705c064f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:24.404155  2205 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Bootstrap starting.
I20260504 14:08:24.405586  2207 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Bootstrap starting.
I20260504 14:08:24.406661  2205 tablet_bootstrap.cc:654] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:24.407497  2205 log.cc:826] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:24.407677  2207 tablet_bootstrap.cc:654] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:24.408390  2207 log.cc:826] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:24.409053  2206 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Bootstrap starting.
I20260504 14:08:24.409518  2205 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: No bootstrap required, opened a new log
I20260504 14:08:24.409727  2205 ts_tablet_manager.cc:1403] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Time spent bootstrapping tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:08:24.410243  2207 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: No bootstrap required, opened a new log
I20260504 14:08:24.410534  2207 ts_tablet_manager.cc:1403] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:08:24.411185  2206 tablet_bootstrap.cc:654] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:24.411953  2206 log.cc:826] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:24.413057  2205 raft_consensus.cc:359] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.413275  2205 raft_consensus.cc:385] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:24.413328  2205 raft_consensus.cc:740] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3162d5c41c52485598abe7a0a0eec507, State: Initialized, Role: FOLLOWER
I20260504 14:08:24.413573  2207 raft_consensus.cc:359] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.413779  2207 raft_consensus.cc:385] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:24.413827  2207 raft_consensus.cc:740] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 455287b19bad4de4a0b2b1c878c6b1d0, State: Initialized, Role: FOLLOWER
I20260504 14:08:24.413816  2205 consensus_queue.cc:260] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.413931  2206 tablet_bootstrap.cc:492] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: No bootstrap required, opened a new log
I20260504 14:08:24.414129  2206 ts_tablet_manager.cc:1403] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Time spent bootstrapping tablet: real 0.005s	user 0.003s	sys 0.000s
I20260504 14:08:24.414227  2207 consensus_queue.cc:260] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.414749  2205 ts_tablet_manager.cc:1434] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Time spent starting tablet: real 0.005s	user 0.006s	sys 0.000s
I20260504 14:08:24.414887  2039 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
I20260504 14:08:24.415050  1902 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
I20260504 14:08:24.415076  2207 ts_tablet_manager.cc:1434] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Time spent starting tablet: real 0.004s	user 0.004s	sys 0.000s
I20260504 14:08:24.418458  2206 raft_consensus.cc:359] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.418689  2206 raft_consensus.cc:385] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:24.418762  2206 raft_consensus.cc:740] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e7715d2595ab45af9b8b972e34e95e10, State: Initialized, Role: FOLLOWER
I20260504 14:08:24.419219  2206 consensus_queue.cc:260] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.420002  2175 heartbeater.cc:499] Master 127.25.254.254:41049 was elected leader, sending a full tablet report...
I20260504 14:08:24.420192  2206 ts_tablet_manager.cc:1434] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Time spent starting tablet: real 0.006s	user 0.004s	sys 0.000s
W20260504 14:08:24.423125  1905 tablet.cc:2404] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:08:24.559732  2213 raft_consensus.cc:493] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:24.559923  2213 raft_consensus.cc:515] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.561203  2213 leader_election.cc:290] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541), 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:24.564288  2179 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.561538 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44781 (local address 127.25.254.195:58483)
0504 14:08:24.561676 (+   138us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.561693 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.561795 (+   102us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.562194 (+   399us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.562197 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.562220 (+    23us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:24.562490 (+   270us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.562497 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.563289 (+   792us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.563292 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.563914 (+   622us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.563922 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.564029 (+   107us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.564043 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.564080 (+    37us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.564152 (+    72us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":45}
I20260504 14:08:24.564431  2181 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.561538 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:46541 (local address 127.25.254.195:35869)
0504 14:08:24.561897 (+   359us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:24.561913 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:24.562014 (+   101us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:24.562365 (+   351us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:24.562368 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:24.562384 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:24.562602 (+   218us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.562609 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.563400 (+   791us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.563403 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:24.564060 (+   657us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:24.564068 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.564176 (+   108us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.564190 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:24.564239 (+    49us) client_negotiation.cc:241] Negotiation successful
0504 14:08:24.564308 (+    69us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":277,"mutex_wait_us":145}
I20260504 14:08:24.564805  2204 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.561738 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:58483 (local address 127.25.254.194:44781)
0504 14:08:24.561877 (+   139us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.561881 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.561898 (+    17us) server_negotiation.cc:408] Connection header received
0504 14:08:24.561958 (+    60us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.561965 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.562022 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.562103 (+    81us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:24.562619 (+   516us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.563155 (+   536us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.564058 (+   903us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.564528 (+   470us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.564571 (+    43us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.564647 (+    76us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.564714 (+    67us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":45}
I20260504 14:08:24.564989  2203 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.561623 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:35869 (local address 127.25.254.193:46541)
0504 14:08:24.561806 (+   183us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.561811 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.562001 (+   190us) server_negotiation.cc:408] Connection header received
0504 14:08:24.562196 (+   195us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.562201 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.562249 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.562344 (+    95us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:24.562766 (+   422us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.563280 (+   514us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.564196 (+   916us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.564705 (+   509us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.564737 (+    32us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.564788 (+    51us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.564847 (+    59us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":91}
I20260504 14:08:24.565491  1993 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbeff15a5ecd47b6a7d4130705c064f5" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" is_pre_election: true
I20260504 14:08:24.565645  1856 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbeff15a5ecd47b6a7d4130705c064f5" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3162d5c41c52485598abe7a0a0eec507" is_pre_election: true
I20260504 14:08:24.565838  1993 raft_consensus.cc:2468] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 0.
I20260504 14:08:24.565991  1856 raft_consensus.cc:2468] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 0.
I20260504 14:08:24.566371  2064 leader_election.cc:304] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 455287b19bad4de4a0b2b1c878c6b1d0, e7715d2595ab45af9b8b972e34e95e10; no voters: 
I20260504 14:08:24.566820  2213 raft_consensus.cc:2804] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:24.566882  2213 raft_consensus.cc:493] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:24.566911  2213 raft_consensus.cc:3060] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:24.567858  2213 raft_consensus.cc:515] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.568241  2213 leader_election.cc:290] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 election: Requested vote from peers 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541), 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781)
I20260504 14:08:24.568610  1856 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbeff15a5ecd47b6a7d4130705c064f5" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3162d5c41c52485598abe7a0a0eec507"
I20260504 14:08:24.568771  1856 raft_consensus.cc:3060] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:24.568919  1993 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbeff15a5ecd47b6a7d4130705c064f5" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
I20260504 14:08:24.569056  1993 raft_consensus.cc:3060] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:24.569665  1856 raft_consensus.cc:2468] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 1.
I20260504 14:08:24.570029  2062 leader_election.cc:304] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3162d5c41c52485598abe7a0a0eec507, e7715d2595ab45af9b8b972e34e95e10; no voters: 
I20260504 14:08:24.570248  2213 raft_consensus.cc:2804] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:24.570402  1993 raft_consensus.cc:2468] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 1.
I20260504 14:08:24.570515  2213 raft_consensus.cc:697] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 1 LEADER]: Becoming Leader. State: Replica: e7715d2595ab45af9b8b972e34e95e10, State: Running, Role: LEADER
I20260504 14:08:24.570919  2213 consensus_queue.cc:237] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } }
I20260504 14:08:24.574496   904 catalog_manager.cc:5671] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 reported cstate change: term changed from 0 to 1, leader changed from <none> to e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "e7715d2595ab45af9b8b972e34e95e10" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } health_report { overall_health: UNKNOWN } } }
I20260504 14:08:24.589640  2200 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.586030 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37350 (local address 127.25.254.195:37043)
0504 14:08:24.586231 (+   201us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.586235 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.586249 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:24.586342 (+    93us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.586345 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.586397 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.586469 (+    72us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:24.586917 (+   448us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.587797 (+   880us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.588556 (+   759us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.588716 (+   160us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.588799 (+    83us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:24.589164 (+   365us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:24.589256 (+    92us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.589470 (+   214us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.589520 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:24.595328  1856 raft_consensus.cc:1275] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Refusing update from remote peer e7715d2595ab45af9b8b972e34e95e10: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:24.595328  1993 raft_consensus.cc:1275] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Refusing update from remote peer e7715d2595ab45af9b8b972e34e95e10: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:24.596117  2213 consensus_queue.cc:1048] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:24.596300  2214 consensus_queue.cc:1048] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Connected to new peer: Peer: permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260504 14:08:24.598536  2178 tablet.cc:2404] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:08:24.604708  2218 mvcc.cc:204] Tried to move back new op lower bound from 7282293574014943232 to 7282293573925490688. Current Snapshot: MvccSnapshot[applied={T|T < 7282293574014943232}]
I20260504 14:08:24.606057  2219 mvcc.cc:204] Tried to move back new op lower bound from 7282293574014943232 to 7282293573925490688. Current Snapshot: MvccSnapshot[applied={T|T < 7282293574014943232}]
I20260504 14:08:24.616815   904 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:51766:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:24.617002   904 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:51766:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:24.621757   904 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 8fc8681ad972459a8b49398127c6b42c: Sending DeleteTablet for 3 replicas of tablet fbeff15a5ecd47b6a7d4130705c064f5
I20260504 14:08:24.622718  1832 tablet_service.cc:1558] Processing DeleteTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:24 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:50226
I20260504 14:08:24.622905  2109 tablet_service.cc:1558] Processing DeleteTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:24 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:37348
I20260504 14:08:24.623382  1973 tablet_service.cc:1558] Processing DeleteTablet for tablet fbeff15a5ecd47b6a7d4130705c064f5 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:24 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:35388
I20260504 14:08:24.623438  2229 tablet_replica.cc:333] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: stopping tablet replica
I20260504 14:08:24.623744  2229 raft_consensus.cc:2243] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:24.624112  2229 raft_consensus.cc:2272] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:24.624733 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 873
I20260504 14:08:24.626359  2229 ts_tablet_manager.cc:1916] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:24.629158  2229 ts_tablet_manager.cc:1929] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:24.629281  2229 log.cc:1199] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/fbeff15a5ecd47b6a7d4130705c064f5
I20260504 14:08:24.629616  2229 ts_tablet_manager.cc:1950] T fbeff15a5ecd47b6a7d4130705c064f5 P 3162d5c41c52485598abe7a0a0eec507: Deleting consensus metadata
I20260504 14:08:24.629973  2230 tablet_replica.cc:333] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: stopping tablet replica
I20260504 14:08:24.630188  2231 tablet_replica.cc:333] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: stopping tablet replica
I20260504 14:08:24.630582  2230 raft_consensus.cc:2243] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:24.630987  2231 raft_consensus.cc:2243] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:24.631421  2231 raft_consensus.cc:2272] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:24.631925  2230 raft_consensus.cc:2272] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Raft consensus is shut down!
W20260504 14:08:24.633200  1789 connection.cc:570] server connection from 127.0.0.1:50226 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260504 14:08:24.633987  2231 ts_tablet_manager.cc:1916] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:24.635108  2230 ts_tablet_manager.cc:1916] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:24.637959  2230 ts_tablet_manager.cc:1929] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:24.637959  2231 ts_tablet_manager.cc:1929] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:24.638067  2231 log.cc:1199] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/fbeff15a5ecd47b6a7d4130705c064f5
I20260504 14:08:24.638139  2230 log.cc:1199] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/fbeff15a5ecd47b6a7d4130705c064f5
I20260504 14:08:24.638469  2231 ts_tablet_manager.cc:1950] T fbeff15a5ecd47b6a7d4130705c064f5 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting consensus metadata
I20260504 14:08:24.638511  2230 ts_tablet_manager.cc:1950] T fbeff15a5ecd47b6a7d4130705c064f5 P e7715d2595ab45af9b8b972e34e95e10: Deleting consensus metadata
W20260504 14:08:24.639632  1927 connection.cc:463] server connection from 127.0.0.1:35388 torn down before Call kudu.tserver.TabletServerAdminService.DeleteTablet from 127.0.0.1:35388 (request call id 1) could send its response
W20260504 14:08:24.639686  2064 connection.cc:463] server connection from 127.0.0.1:37348 torn down before Call kudu.tserver.TabletServerAdminService.DeleteTablet from 127.0.0.1:37348 (request call id 1) could send its response
I20260504 14:08:24.650092  2188 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.640569 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35278 (local address 127.25.254.252:44857)
0504 14:08:24.640728 (+   159us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.640733 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.641335 (+   602us) server_negotiation.cc:408] Connection header received
0504 14:08:24.641404 (+    69us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.641411 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.641482 (+    71us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.641595 (+   113us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.642642 (+  1047us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.643230 (+   588us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.644159 (+   929us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.644393 (+   234us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.646560 (+  2167us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.646601 (+    41us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.646604 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.646632 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.648320 (+  1688us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.648943 (+   623us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.648946 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.648948 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.648994 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.649236 (+   242us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.649239 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.649241 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.649383 (+   142us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.649512 (+   129us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.649751 (+   239us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.649879 (+   128us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":71}
I20260504 14:08:24.650563  2191 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:24.640147 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42832 (local address 127.25.254.253:42377)
0504 14:08:24.640292 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:24.640296 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:24.640628 (+   332us) server_negotiation.cc:408] Connection header received
0504 14:08:24.640894 (+   266us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:24.640898 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:24.640971 (+    73us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:24.641074 (+   103us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:24.642717 (+  1643us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.643519 (+   802us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:24.644419 (+   900us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:24.644623 (+   204us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:24.646407 (+  1784us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:24.646430 (+    23us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:24.646434 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:24.646466 (+    32us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:24.648330 (+  1864us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.648883 (+   553us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.648887 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.648889 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.648951 (+    62us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:24.649208 (+   257us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:24.649212 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:24.649214 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:24.649389 (+   175us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:24.649484 (+    95us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:24.650224 (+   740us) server_negotiation.cc:300] Negotiation successful
0504 14:08:24.650352 (+   128us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:08:25.609256  2238 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:25.608786 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:36967)
0504 14:08:25.609053 (+   267us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:25.609136 (+    83us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":175,"thread_start_us":77,"threads_started":1}
W20260504 14:08:25.609601  2175 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:41049 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:25.610323  2239 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:25.609804 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:40649)
0504 14:08:25.610042 (+   238us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:25.610123 (+    81us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":163,"thread_start_us":93,"threads_started":1}
W20260504 14:08:25.610675  2039 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:41049 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:25.612353  2240 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:25.611964 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:52441)
0504 14:08:25.612192 (+   228us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:25.612267 (+    75us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":143,"thread_start_us":76,"threads_started":1}
W20260504 14:08:25.612911  1902 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:41049 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.122278  2241 raft_consensus.cc:493] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 8fc8681ad972459a8b49398127c6b42c)
I20260504 14:08:26.122426  2241 raft_consensus.cc:515] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } }
I20260504 14:08:26.123054  2241 leader_election.cc:290] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049), 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253:42377)
I20260504 14:08:26.123498  1533 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "00000000000000000000000000000000" candidate_uuid: "b782998ffee94b86a80d60ed27ff59f1" candidate_term: 2 candidate_status { last_received { term: 1 index: 12 } } ignore_live_leader: false dest_uuid: "5f6047ff1b18475e986323ecedb0f8b5" is_pre_election: true
I20260504 14:08:26.123631  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.123231 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52240)
0504 14:08:26.123464 (+   233us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.123527 (+    63us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":148,"thread_start_us":77,"threads_started":1}
I20260504 14:08:26.123782  1533 raft_consensus.cc:2468] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b782998ffee94b86a80d60ed27ff59f1 in term 1.
I20260504 14:08:26.123931  2243 raft_consensus.cc:493] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 8fc8681ad972459a8b49398127c6b42c)
W20260504 14:08:26.123904  1695 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.124029  2243 raft_consensus.cc:515] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } }
I20260504 14:08:26.124209  1694 leader_election.cc:304] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5f6047ff1b18475e986323ecedb0f8b5, b782998ffee94b86a80d60ed27ff59f1; no voters: 
I20260504 14:08:26.124392  2241 raft_consensus.cc:2804] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260504 14:08:26.124444  2241 raft_consensus.cc:493] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 8fc8681ad972459a8b49398127c6b42c)
I20260504 14:08:26.124480  2241 raft_consensus.cc:3060] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 1 FOLLOWER]: Advancing to term 2
I20260504 14:08:26.124877  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.124585 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52248)
0504 14:08:26.124761 (+   176us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.124808 (+    47us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":42}
W20260504 14:08:26.125109  1695 leader_election.cc:336] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049): Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.125134  2243 leader_election.cc:290] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049), b782998ffee94b86a80d60ed27ff59f1 (127.25.254.252:44857)
I20260504 14:08:26.125656  2245 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.125260 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52262)
0504 14:08:26.125453 (+   193us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.125518 (+    65us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":123,"thread_start_us":66,"threads_started":1}
I20260504 14:08:26.125782  1719 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "00000000000000000000000000000000" candidate_uuid: "5f6047ff1b18475e986323ecedb0f8b5" candidate_term: 2 candidate_status { last_received { term: 1 index: 12 } } ignore_live_leader: false dest_uuid: "b782998ffee94b86a80d60ed27ff59f1" is_pre_election: true
W20260504 14:08:26.125805  1509 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.125981  2241 raft_consensus.cc:515] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 2 FOLLOWER]: Starting leader election with config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } }
I20260504 14:08:26.126191  1719 raft_consensus.cc:2393] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 5f6047ff1b18475e986323ecedb0f8b5 in current term 2: Already voted for candidate b782998ffee94b86a80d60ed27ff59f1 in this term.
I20260504 14:08:26.126359  2241 leader_election.cc:290] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 election: Requested vote from peers 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049), 5f6047ff1b18475e986323ecedb0f8b5 (127.25.254.253:42377)
I20260504 14:08:26.126688  2245 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.126413 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52274)
0504 14:08:26.126564 (+   151us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.126622 (+    58us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":78}
I20260504 14:08:26.126751  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.126423 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52282)
0504 14:08:26.126640 (+   217us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.126687 (+    47us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":140}
W20260504 14:08:26.126768  1509 leader_election.cc:336] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049): Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.126793  1533 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "00000000000000000000000000000000" candidate_uuid: "b782998ffee94b86a80d60ed27ff59f1" candidate_term: 2 candidate_status { last_received { term: 1 index: 12 } } ignore_live_leader: false dest_uuid: "5f6047ff1b18475e986323ecedb0f8b5"
I20260504 14:08:26.126888  1533 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 1 FOLLOWER]: Advancing to term 2
I20260504 14:08:26.126869  1509 leader_election.cc:304] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5f6047ff1b18475e986323ecedb0f8b5; no voters: 8fc8681ad972459a8b49398127c6b42c, b782998ffee94b86a80d60ed27ff59f1
I20260504 14:08:26.127333  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.127090 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52284)
0504 14:08:26.127216 (+   126us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.127276 (+    60us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":66}
W20260504 14:08:26.127442  1695 leader_election.cc:336] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049): Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
I20260504 14:08:26.127964  1533 raft_consensus.cc:2468] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b782998ffee94b86a80d60ed27ff59f1 in term 2.
I20260504 14:08:26.128175  2243 raft_consensus.cc:2749] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 2 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260504 14:08:26.128297  1694 leader_election.cc:304] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 5f6047ff1b18475e986323ecedb0f8b5, b782998ffee94b86a80d60ed27ff59f1; no voters: 8fc8681ad972459a8b49398127c6b42c
I20260504 14:08:26.128484  2241 raft_consensus.cc:2804] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 2 FOLLOWER]: Leader election won for term 2
I20260504 14:08:26.128669  2241 raft_consensus.cc:697] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [term 2 LEADER]: Becoming Leader. State: Replica: b782998ffee94b86a80d60ed27ff59f1, State: Running, Role: LEADER
I20260504 14:08:26.128918  2241 consensus_queue.cc:237] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 12, Committed index: 12, Last appended: 1.12, Last appended by leader: 12, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } }
I20260504 14:08:26.130370  2247 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b782998ffee94b86a80d60ed27ff59f1. Latest consensus state: current_term: 2 leader_uuid: "b782998ffee94b86a80d60ed27ff59f1" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:26.130620  2247 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:26.130987  2249 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:26.133116  2249 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:26.133526  2249 catalog_manager.cc:1269] Loaded cluster ID: c0935c8e513349598af8d1e724bba87c
I20260504 14:08:26.133613  2249 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:26.134146  2249 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:26.134603  2249 catalog_manager.cc:6055] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Loaded TSK: 0
I20260504 14:08:26.135176  2249 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:08:26.173789  1709 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:35278:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:08:26.176383  1709 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:26.179576  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.179317 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52292)
0504 14:08:26.179449 (+   132us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.179494 (+    45us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":44}
I20260504 14:08:26.179663  1533 raft_consensus.cc:1275] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [term 2 FOLLOWER]: Refusing update from remote peer b782998ffee94b86a80d60ed27ff59f1: Log matching property violated. Preceding OpId in replica: term: 1 index: 12. Preceding OpId from leader: term: 2 index: 14. (index mismatch)
I20260504 14:08:26.180173  2247 consensus_queue.cc:1048] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 13, Last known committed idx: 12, Time since last communication: 0.000s
I20260504 14:08:26.180256  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.180047 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.0.0.1:52306)
0504 14:08:26.180158 (+   111us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.180203 (+    45us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":40}
W20260504 14:08:26.180410  1695 consensus_peers.cc:597] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 -> Peer 8fc8681ad972459a8b49398127c6b42c (127.25.254.254:41049): Couldn't send request to peer 8fc8681ad972459a8b49398127c6b42c. Status: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260504 14:08:26.181699  2243 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b782998ffee94b86a80d60ed27ff59f1. Latest consensus state: current_term: 2 leader_uuid: "b782998ffee94b86a80d60ed27ff59f1" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:26.181815  2243 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:26.183256  2247 sys_catalog.cc:455] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: SysCatalogTable state changed. Reason: Peer health change. Latest consensus state: current_term: 2 leader_uuid: "b782998ffee94b86a80d60ed27ff59f1" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:26.183378  2247 sys_catalog.cc:458] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:26.183765  2243 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: SysCatalogTable state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 2 leader_uuid: "b782998ffee94b86a80d60ed27ff59f1" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "8fc8681ad972459a8b49398127c6b42c" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 41049 } } peers { permanent_uuid: "5f6047ff1b18475e986323ecedb0f8b5" member_type: VOTER last_known_addr { host: "127.25.254.253" port: 42377 } attrs { promote: false } } peers { permanent_uuid: "b782998ffee94b86a80d60ed27ff59f1" member_type: VOTER last_known_addr { host: "127.25.254.252" port: 44857 } attrs { promote: false } } }
I20260504 14:08:26.183918  2243 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f6047ff1b18475e986323ecedb0f8b5 [sys.catalog]: This master's current role is: FOLLOWER
I20260504 14:08:26.194834  2257 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.190920 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:46541 (local address 127.0.0.1:50234)
0504 14:08:26.191263 (+   343us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.191282 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:26.191349 (+    67us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:26.191860 (+   511us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:26.191864 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:26.191881 (+    17us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:26.192149 (+   268us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.192156 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.193327 (+  1171us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.193334 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:26.194425 (+  1091us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.194433 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.194556 (+   123us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.194574 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:08:26.194626 (+    52us) client_negotiation.cc:241] Negotiation successful
0504 14:08:26.194682 (+    56us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":256,"thread_start_us":147,"threads_started":1}
I20260504 14:08:26.194869  2242 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.190614 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44781 (local address 127.0.0.1:35392)
0504 14:08:26.190743 (+   129us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.190762 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:26.190864 (+   102us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:26.192393 (+  1529us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:26.192396 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:26.192418 (+    22us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:26.192655 (+   237us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.192664 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.193631 (+   967us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.193635 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:26.194575 (+   940us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.194582 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.194660 (+    78us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.194675 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:26.194716 (+    41us) client_negotiation.cc:241] Negotiation successful
0504 14:08:26.194780 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":36}
I20260504 14:08:26.195585  2258 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.191365 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50234 (local address 127.25.254.193:46541)
0504 14:08:26.191602 (+   237us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:26.191605 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:26.191616 (+    11us) server_negotiation.cc:408] Connection header received
0504 14:08:26.191697 (+    81us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:26.191700 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:26.191748 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:26.191849 (+   101us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:26.192291 (+   442us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.193215 (+   924us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.194547 (+  1332us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.195315 (+   768us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.195354 (+    39us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:26.195407 (+    53us) server_negotiation.cc:300] Negotiation successful
0504 14:08:26.195457 (+    50us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":163,"thread_start_us":65,"threads_started":1}
I20260504 14:08:26.195894  2259 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.191984 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:37043 (local address 127.0.0.1:37366)
0504 14:08:26.192254 (+   270us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.192268 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:26.192337 (+    69us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:26.192764 (+   427us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:26.192767 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:26.192788 (+    21us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:26.193039 (+   251us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.193046 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.194306 (+  1260us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.194310 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:26.195558 (+  1248us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:26.195567 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.195689 (+   122us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.195704 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:26.195755 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:08:26.195804 (+    49us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":190,"thread_start_us":97,"threads_started":1}
I20260504 14:08:26.196276  1832 tablet_service.cc:1511] Processing CreateTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 (DEFAULT_TABLE table=test-table [id=9180f532e41542bf8249a4d153579e93]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:26.196424  2260 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.192260 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37366 (local address 127.25.254.195:37043)
0504 14:08:26.192516 (+   256us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:26.192521 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:26.192534 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:26.192582 (+    48us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:26.192586 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:26.192628 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:26.192734 (+   106us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:26.193175 (+   441us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.194076 (+   901us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.195709 (+  1633us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.196163 (+   454us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.196195 (+    32us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:26.196245 (+    50us) server_negotiation.cc:300] Negotiation successful
0504 14:08:26.196305 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":186,"thread_start_us":81,"threads_started":1}
I20260504 14:08:26.196594  1832 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet e9432a444c8a4a26ba2acc2f9bd88b10. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:26.197033  2109 tablet_service.cc:1511] Processing CreateTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 (DEFAULT_TABLE table=test-table [id=9180f532e41542bf8249a4d153579e93]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:26.197273  2109 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet e9432a444c8a4a26ba2acc2f9bd88b10. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:26.198849  2256 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.190708 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:35392 (local address 127.25.254.194:44781)
0504 14:08:26.192131 (+  1423us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:26.192135 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:26.192149 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:26.192202 (+    53us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:26.192205 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:26.192255 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:26.192771 (+   516us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:26.192871 (+   100us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.193505 (+   634us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.195381 (+  1876us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.198582 (+  3201us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.198618 (+    36us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:26.198693 (+    75us) server_negotiation.cc:300] Negotiation successful
0504 14:08:26.198747 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":1324,"thread_start_us":94,"threads_started":1}
I20260504 14:08:26.199329  2261 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: Bootstrap starting.
I20260504 14:08:26.199820  1973 tablet_service.cc:1511] Processing CreateTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 (DEFAULT_TABLE table=test-table [id=9180f532e41542bf8249a4d153579e93]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:26.200094  1973 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet e9432a444c8a4a26ba2acc2f9bd88b10. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:26.200372  2261 tablet_bootstrap.cc:654] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:26.201926  2261 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: No bootstrap required, opened a new log
I20260504 14:08:26.202041  2261 ts_tablet_manager.cc:1403] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:26.202615  2264 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Bootstrap starting.
I20260504 14:08:26.202581  2261 raft_consensus.cc:359] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.202692  2261 raft_consensus.cc:385] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:26.202701  2262 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Bootstrap starting.
I20260504 14:08:26.202729  2261 raft_consensus.cc:740] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3162d5c41c52485598abe7a0a0eec507, State: Initialized, Role: FOLLOWER
I20260504 14:08:26.202884  2261 consensus_queue.cc:260] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.203609  2264 tablet_bootstrap.cc:654] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:26.204036  2262 tablet_bootstrap.cc:654] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:26.204226  2266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.203714 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:56651)
0504 14:08:26.203983 (+   269us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.204035 (+    52us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":165,"thread_start_us":95,"threads_started":1}
I20260504 14:08:26.205090  2261 ts_tablet_manager.cc:1434] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: Time spent starting tablet: real 0.003s	user 0.001s	sys 0.000s
I20260504 14:08:26.205391  2262 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: No bootstrap required, opened a new log
I20260504 14:08:26.205490  2262 ts_tablet_manager.cc:1403] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:26.205721  2266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.205451 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:35717)
0504 14:08:26.205604 (+   153us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.205659 (+    55us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":83}
I20260504 14:08:26.205991  2262 raft_consensus.cc:359] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.206103  2262 raft_consensus.cc:385] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:26.206137  2262 raft_consensus.cc:740] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e7715d2595ab45af9b8b972e34e95e10, State: Initialized, Role: FOLLOWER
I20260504 14:08:26.206226  1903 heartbeater.cc:499] Master 127.25.254.252:44857 was elected leader, sending a full tablet report...
I20260504 14:08:26.206317  2262 consensus_queue.cc:260] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.206665  2262 ts_tablet_manager.cc:1434] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:26.208060  2270 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.207108 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:39971)
0504 14:08:26.207896 (+   788us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.207966 (+    70us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":707,"thread_start_us":96,"threads_started":1}
I20260504 14:08:26.208458  2176 heartbeater.cc:499] Master 127.25.254.252:44857 was elected leader, sending a full tablet report...
I20260504 14:08:26.208657  2264 tablet_bootstrap.cc:492] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: No bootstrap required, opened a new log
I20260504 14:08:26.208763  2264 ts_tablet_manager.cc:1403] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Time spent bootstrapping tablet: real 0.006s	user 0.002s	sys 0.000s
I20260504 14:08:26.208786  2270 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.208548 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:49339)
0504 14:08:26.208645 (+    97us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.208693 (+    48us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":25}
I20260504 14:08:26.209272  2264 raft_consensus.cc:359] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.209368  2264 raft_consensus.cc:385] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:26.209472  2264 raft_consensus.cc:740] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 455287b19bad4de4a0b2b1c878c6b1d0, State: Initialized, Role: FOLLOWER
I20260504 14:08:26.209610  2264 consensus_queue.cc:260] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.209957  2264 ts_tablet_manager.cc:1434] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:08:26.210726  2272 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.210190 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:36987)
0504 14:08:26.210471 (+   281us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.210538 (+    67us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":173,"thread_start_us":94,"threads_started":1}
I20260504 14:08:26.211580  2040 heartbeater.cc:499] Master 127.25.254.252:44857 was elected leader, sending a full tablet report...
I20260504 14:08:26.211653  2272 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.211212 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:38489)
0504 14:08:26.211305 (+    93us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.211343 (+    38us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":26}
I20260504 14:08:26.238549  2269 raft_consensus.cc:493] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:26.238833  2269 raft_consensus.cc:515] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.239337  2269 leader_election.cc:290] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781), 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
I20260504 14:08:26.239799  1993 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "e9432a444c8a4a26ba2acc2f9bd88b10" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" is_pre_election: true
I20260504 14:08:26.239995  1993 raft_consensus.cc:2468] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 0.
I20260504 14:08:26.239984  1856 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "e9432a444c8a4a26ba2acc2f9bd88b10" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3162d5c41c52485598abe7a0a0eec507" is_pre_election: true
I20260504 14:08:26.240132  1856 raft_consensus.cc:2468] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 0.
I20260504 14:08:26.240381  2064 leader_election.cc:304] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 455287b19bad4de4a0b2b1c878c6b1d0, e7715d2595ab45af9b8b972e34e95e10; no voters: 
I20260504 14:08:26.240626  2269 raft_consensus.cc:2804] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:26.240701  2269 raft_consensus.cc:493] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:26.240770  2269 raft_consensus.cc:3060] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:26.241721  2269 raft_consensus.cc:515] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.242084  2269 leader_election.cc:290] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 election: Requested vote from peers 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781), 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541)
I20260504 14:08:26.242648  1856 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "e9432a444c8a4a26ba2acc2f9bd88b10" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3162d5c41c52485598abe7a0a0eec507"
I20260504 14:08:26.242790  1856 raft_consensus.cc:3060] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:26.243639  1856 raft_consensus.cc:2468] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 1.
I20260504 14:08:26.243901  1993 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "e9432a444c8a4a26ba2acc2f9bd88b10" candidate_uuid: "e7715d2595ab45af9b8b972e34e95e10" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "455287b19bad4de4a0b2b1c878c6b1d0"
I20260504 14:08:26.243952  2062 leader_election.cc:304] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3162d5c41c52485598abe7a0a0eec507, e7715d2595ab45af9b8b972e34e95e10; no voters: 
I20260504 14:08:26.244007  1993 raft_consensus.cc:3060] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:26.244168  2269 raft_consensus.cc:2804] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:26.244431  2269 raft_consensus.cc:697] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 1 LEADER]: Becoming Leader. State: Replica: e7715d2595ab45af9b8b972e34e95e10, State: Running, Role: LEADER
I20260504 14:08:26.244588  2269 consensus_queue.cc:237] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } }
I20260504 14:08:26.245150  1993 raft_consensus.cc:2468] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e7715d2595ab45af9b8b972e34e95e10 in term 1.
I20260504 14:08:26.245196  2270 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.244918 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:42183)
0504 14:08:26.245066 (+   148us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.245116 (+    50us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":25}
I20260504 14:08:26.246405  1708 catalog_manager.cc:5671] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 reported cstate change: term changed from 0 to 1, leader changed from <none> to e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "e7715d2595ab45af9b8b972e34e95e10" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e7715d2595ab45af9b8b972e34e95e10" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 37043 } health_report { overall_health: HEALTHY } } }
I20260504 14:08:26.260682  2260 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.256142 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:37378 (local address 127.25.254.195:37043)
0504 14:08:26.256270 (+   128us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:26.256274 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:26.256286 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:26.256335 (+    49us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:26.256337 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:26.256381 (+    44us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:26.256452 (+    71us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:26.256949 (+   497us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.257652 (+   703us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:26.259378 (+  1726us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:26.259824 (+   446us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:26.259893 (+    69us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:26.260023 (+   130us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:26.260117 (+    94us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:26.260506 (+   389us) server_negotiation.cc:300] Negotiation successful
0504 14:08:26.260554 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":47}
I20260504 14:08:26.263449  1856 raft_consensus.cc:1275] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Refusing update from remote peer e7715d2595ab45af9b8b972e34e95e10: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:26.263615  1993 raft_consensus.cc:1275] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Refusing update from remote peer e7715d2595ab45af9b8b972e34e95e10: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:26.263979  2269 consensus_queue.cc:1048] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3162d5c41c52485598abe7a0a0eec507" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 46541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:26.264137  2273 consensus_queue.cc:1048] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [LEADER]: Connected to new peer: Peer: permanent_uuid: "455287b19bad4de4a0b2b1c878c6b1d0" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44781 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:26.266759  2272 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.266023 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:43473)
0504 14:08:26.266616 (+   593us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.266673 (+    57us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":486}
I20260504 14:08:26.267138  2266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.266474 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:48041)
0504 14:08:26.267003 (+   529us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.267054 (+    51us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":423}
I20260504 14:08:26.269562  2272 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.269287 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.194:36685)
0504 14:08:26.269438 (+   151us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.269484 (+    46us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":72}
I20260504 14:08:26.269831  2270 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.268330 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:52025)
0504 14:08:26.269705 (+  1375us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.269758 (+    53us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":1284}
I20260504 14:08:26.279131  2266 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.278850 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.193:56219)
0504 14:08:26.278975 (+   125us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.279037 (+    62us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":34}
I20260504 14:08:26.280030  2270 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.279791 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:41049 (local address 127.25.254.195:39719)
0504 14:08:26.279907 (+   116us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.279953 (+    46us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:41049: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":28}
I20260504 14:08:26.280941  2278 mvcc.cc:204] Tried to move back new op lower bound from 7282293580849418240 to 7282293580779294720. Current Snapshot: MvccSnapshot[applied={T|T < 7282293580849418240 or (T in {7282293580849418240})}]
I20260504 14:08:26.282043  1709 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:35278:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:26.282238  1709 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:35278:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:26.287003  1709 catalog_manager.cc:5958] T 00000000000000000000000000000000 P b782998ffee94b86a80d60ed27ff59f1: Sending DeleteTablet for 3 replicas of tablet e9432a444c8a4a26ba2acc2f9bd88b10
I20260504 14:08:26.287964  1832 tablet_service.cc:1558] Processing DeleteTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:26 UTC) from {username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:50234
I20260504 14:08:26.288374  2109 tablet_service.cc:1558] Processing DeleteTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:26 UTC) from {username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:37366
I20260504 14:08:26.288477  2284 tablet_replica.cc:333] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507: stopping tablet replica
I20260504 14:08:26.288503  1973 tablet_service.cc:1558] Processing DeleteTablet for tablet e9432a444c8a4a26ba2acc2f9bd88b10 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:26 UTC) from {username='oryx', principal='oryx/127.25.254.252@KRBTEST.COM'} at 127.0.0.1:35392
I20260504 14:08:26.288669  2284 raft_consensus.cc:2243] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:26.288904  2284 raft_consensus.cc:2272] T e9432a444c8a4a26ba2acc2f9bd88b10 P 3162d5c41c52485598abe7a0a0eec507 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:26.289099  2286 tablet_replica.cc:333] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: stopping tablet replica
I20260504 14:08:26.289247  2286 raft_consensus.cc:2243] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:26.289311  2285 tablet_replica.cc:333] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: stopping tablet replica
I20260504 14:08:26.289397  2286 raft_consensus.cc:2272] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:26.289415  2285 raft_consensus.cc:2243] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:26.289647  2285 raft_consensus.cc:2272] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:26.290045 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1773
I20260504 14:08:26.291306  2286 ts_tablet_manager.cc:1916] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:26.292507  2285 ts_tablet_manager.cc:1916] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:26.294373  2286 ts_tablet_manager.cc:1929] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:26.294471  2286 log.cc:1199] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/e9432a444c8a4a26ba2acc2f9bd88b10
I20260504 14:08:26.294695  2285 ts_tablet_manager.cc:1929] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:26.294780  2285 log.cc:1199] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipalMultipleMaster.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/e9432a444c8a4a26ba2acc2f9bd88b10
I20260504 14:08:26.294792  2286 ts_tablet_manager.cc:1950] T e9432a444c8a4a26ba2acc2f9bd88b10 P 455287b19bad4de4a0b2b1c878c6b1d0: Deleting consensus metadata
I20260504 14:08:26.295071  2285 ts_tablet_manager.cc:1950] T e9432a444c8a4a26ba2acc2f9bd88b10 P e7715d2595ab45af9b8b972e34e95e10: Deleting consensus metadata
I20260504 14:08:26.295727  1695 catalog_manager.cc:5002] TS 455287b19bad4de4a0b2b1c878c6b1d0 (127.25.254.194:44781): tablet e9432a444c8a4a26ba2acc2f9bd88b10 (table test-table [id=9180f532e41542bf8249a4d153579e93]) successfully deleted
I20260504 14:08:26.295925  1694 catalog_manager.cc:5002] TS e7715d2595ab45af9b8b972e34e95e10 (127.25.254.195:37043): tablet e9432a444c8a4a26ba2acc2f9bd88b10 (table test-table [id=9180f532e41542bf8249a4d153579e93]) successfully deleted
I20260504 14:08:26.299867 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1909
I20260504 14:08:26.301079  2259 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:26.300793 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:46541 (local address 127.0.0.1:50238)
0504 14:08:26.300914 (+   121us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:26.300970 (+    56us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:46541: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":32}
W20260504 14:08:26.301857  1693 catalog_manager.cc:4729] TS 3162d5c41c52485598abe7a0a0eec507 (127.25.254.193:46541): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet e9432a444c8a4a26ba2acc2f9bd88b10: Network error: Client connection negotiation failed: client connection to 127.25.254.193:46541: connect: Connection refused (error 111)
I20260504 14:08:26.306721 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 2046
I20260504 14:08:26.314266 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1377
I20260504 14:08:26.320564 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 1500
2026-05-04T14:08:26Z chronyd exiting
[       OK ] SecurityITest.TestNonDefaultPrincipalMultipleMaster (7635 ms)
[ RUN      ] SecurityITest.TestNonDefaultPrincipal
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:26 dist-test-slave-2x32 krb5kdc[2289](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:26 dist-test-slave-2x32 krb5kdc[2289](info): set up 2 sockets
May 04 14:08:26 dist-test-slave-2x32 krb5kdc[2289](info): commencing operation
krb5kdc: starting...
W20260504 14:08:28.355055 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.010s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:28 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:28Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:28Z Disabled control of system clock
WARNING: no policy specified for oryx/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:28.509872 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:39351
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46295
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:39351
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--txn_manager_enabled=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:28.615339  2305 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:28.615631  2305 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:28.615756  2305 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:28.619184  2305 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:28.619290  2305 flags.cc:432] Enabled experimental flag: --txn_manager_enabled=true
W20260504 14:08:28.619326  2305 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:28.619375  2305 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:28.619421  2305 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:28.619478  2305 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:28.624106  2305 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46295
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:39351
--txn_manager_enabled=true
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:39351
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2305
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:28.625315  2305 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:28.626235  2305 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:28.632053  2311 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:28.632061  2310 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:28.632158  2313 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:28.632257  2305 server_base.cc:1061] running on GCE node
I20260504 14:08:28.633050  2305 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:28.634078  2305 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:28.635313  2305 hybrid_clock.cc:648] HybridClock initialized: now 1777903708635296 us; error 32 us; skew 500 ppm
May 04 14:08:28 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:28.638095  2305 init.cc:377] Logged in from keytab as oryx/127.25.254.254@KRBTEST.COM (short username oryx)
I20260504 14:08:28.639266  2305 webserver.cc:492] Webserver started at http://127.25.254.254:44771/ using document root <none> and password file <none>
I20260504 14:08:28.639842  2305 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:28.639894  2305 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:28.640100  2305 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:28.641782  2305 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "70b2453e2a074974bbdb6e971a688516"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "b903c6b89f0b3d43f1e04134d6864774"
server_key_iv: "f49a797ff78e48e52f6d50e1270edf8d"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.642326  2305 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "70b2453e2a074974bbdb6e971a688516"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "b903c6b89f0b3d43f1e04134d6864774"
server_key_iv: "f49a797ff78e48e52f6d50e1270edf8d"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.645814  2305 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.001s
I20260504 14:08:28.648326  2320 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:28.649422  2305 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.002s
I20260504 14:08:28.649528  2305 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "70b2453e2a074974bbdb6e971a688516"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "b903c6b89f0b3d43f1e04134d6864774"
server_key_iv: "f49a797ff78e48e52f6d50e1270edf8d"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.649703  2305 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:28.663576  2305 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:28.666941  2305 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:28.667169  2305 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:28.676240  2305 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:39351
I20260504 14:08:28.676512  2382 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:39351 every 8 connection(s)
I20260504 14:08:28.677322  2305 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:28.680208  2383 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:28.685611  2383 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Bootstrap starting.
I20260504 14:08:28.686415 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 2305
I20260504 14:08:28.686540 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:28.686796 26619 external_mini_cluster.cc:1468] Setting key 9329ec92b5211769dbca6b1efcac6d5e
I20260504 14:08:28.688650  2383 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:28.689381  2383 log.cc:826] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:28.691255  2383 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: No bootstrap required, opened a new log
I20260504 14:08:28.694535  2383 raft_consensus.cc:359] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } }
I20260504 14:08:28.694742  2383 raft_consensus.cc:385] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:28.694828  2383 raft_consensus.cc:740] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 70b2453e2a074974bbdb6e971a688516, State: Initialized, Role: FOLLOWER
I20260504 14:08:28.695286  2383 consensus_queue.cc:260] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } }
I20260504 14:08:28.695403  2383 raft_consensus.cc:399] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:28.695487  2383 raft_consensus.cc:493] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:28.695581  2383 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:28.696966  2383 raft_consensus.cc:515] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } }
I20260504 14:08:28.697367  2383 leader_election.cc:304] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 70b2453e2a074974bbdb6e971a688516; no voters: 
I20260504 14:08:28.697726  2383 leader_election.cc:290] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:28.697818  2388 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:28.697992  2388 raft_consensus.cc:697] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [term 1 LEADER]: Becoming Leader. State: Replica: 70b2453e2a074974bbdb6e971a688516, State: Running, Role: LEADER
I20260504 14:08:28.698309  2388 consensus_queue.cc:237] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } }
May 04 14:08:28 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:28.699038  2383 sys_catalog.cc:565] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:28.699982  2390 sys_catalog.cc:455] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "70b2453e2a074974bbdb6e971a688516" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } } }
I20260504 14:08:28.700024  2389 sys_catalog.cc:455] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 70b2453e2a074974bbdb6e971a688516. Latest consensus state: current_term: 1 leader_uuid: "70b2453e2a074974bbdb6e971a688516" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "70b2453e2a074974bbdb6e971a688516" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 39351 } } }
I20260504 14:08:28.700106  2390 sys_catalog.cc:458] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:28.700136  2389 sys_catalog.cc:458] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:28.700513  2397 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:28.704003  2397 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:28.705597  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:28.688077 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55214 (local address 127.25.254.254:39351)
0504 14:08:28.688536 (+   459us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:28.688546 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:28.688578 (+    32us) server_negotiation.cc:408] Connection header received
0504 14:08:28.689181 (+   603us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:28.689199 (+    18us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:28.689543 (+   344us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:28.689882 (+   339us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:28.690845 (+   963us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.691643 (+   798us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:28.692358 (+   715us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.692649 (+   291us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:28.700423 (+  7774us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:28.700449 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:28.700464 (+    15us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:28.700502 (+    38us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:28.702980 (+  2478us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.703475 (+   495us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.703481 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.703487 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.703567 (+    80us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.703836 (+   269us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.703839 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.703841 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.704225 (+   384us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:28.704441 (+   216us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:28.704749 (+   308us) server_negotiation.cc:300] Negotiation successful
0504 14:08:28.705042 (+   293us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":320,"thread_start_us":153,"threads_started":1}
I20260504 14:08:28.709425  2397 catalog_manager.cc:1357] Generated new cluster ID: 7877e19b96aa41cb845ff49fe481971c
I20260504 14:08:28.709496  2397 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:28.724015  2397 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:28.725240  2397 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:28.734483  2397 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Generated new TSK 0
I20260504 14:08:28.735136  2397 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for oryx/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:28.794121 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:39351
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46295
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:28.903610  2411 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:28.903843  2411 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:28.903903  2411 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:28.907253  2411 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:28.907320  2411 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:28.907372  2411 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:08:28.907414  2411 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:28.911878  2411 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46295
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:39351
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2411
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:28.912932  2411 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:28.913755  2411 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:28.920145  2417 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:28.920169  2416 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:28.920168  2419 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:28.920922  2411 server_base.cc:1061] running on GCE node
I20260504 14:08:28.921306  2411 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:28.921854  2411 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:28.923069  2411 hybrid_clock.cc:648] HybridClock initialized: now 1777903708923028 us; error 57 us; skew 500 ppm
May 04 14:08:28 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:28.925729  2411 init.cc:377] Logged in from keytab as oryx/127.25.254.193@KRBTEST.COM (short username oryx)
I20260504 14:08:28.926967  2411 webserver.cc:492] Webserver started at http://127.25.254.193:32849/ using document root <none> and password file <none>
I20260504 14:08:28.927570  2411 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:28.927649  2411 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:28.927893  2411 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:28.930059  2411 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "f04ea53f457241c9aa321af4ab8e451a"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "e432b6d061804358fc7176898247ceaa"
server_key_iv: "a503afad7892751318c508d47f2a86d4"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.930685  2411 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "f04ea53f457241c9aa321af4ab8e451a"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "e432b6d061804358fc7176898247ceaa"
server_key_iv: "a503afad7892751318c508d47f2a86d4"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.934103  2411 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:28.936460  2426 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:28.937510  2411 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.002s
I20260504 14:08:28.937750  2411 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "f04ea53f457241c9aa321af4ab8e451a"
format_stamp: "Formatted at 2026-05-04 14:08:28 on dist-test-slave-2x32"
server_key: "e432b6d061804358fc7176898247ceaa"
server_key_iv: "a503afad7892751318c508d47f2a86d4"
server_key_version: "encryptionkey@0"
I20260504 14:08:28.937883  2411 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:28.962589  2411 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:28.965638  2411 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:28.965864  2411 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:28.967520  2411 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:28.967609  2411 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:28.967679  2411 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:28.967727  2411 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:08:28 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:28.980614  2411 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:33293
I20260504 14:08:28.981803  2411 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:28.982220  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:28.968370 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:38117 (local address 127.25.254.254:39351)
0504 14:08:28.968504 (+   134us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:28.968507 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:28.969473 (+   966us) server_negotiation.cc:408] Connection header received
0504 14:08:28.970813 (+  1340us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:28.970816 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:28.970882 (+    66us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:28.971000 (+   118us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:28.972606 (+  1606us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.973472 (+   866us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:28.974547 (+  1075us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.974733 (+   186us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:28.977693 (+  2960us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:28.977724 (+    31us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:28.977727 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:28.977790 (+    63us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:28.979757 (+  1967us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.980375 (+   618us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.980378 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.980379 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.980460 (+    81us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.980943 (+   483us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.980947 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.980949 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.981125 (+   176us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:28.981227 (+   102us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:28.981895 (+   668us) server_negotiation.cc:300] Negotiation successful
0504 14:08:28.982045 (+   150us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":45}
I20260504 14:08:28.982519  2541 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:33293 every 8 connection(s)
I20260504 14:08:28.982931  2435 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:28.968770 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.193:38117)
0504 14:08:28.969317 (+   547us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:28.969362 (+    45us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:28.970555 (+  1193us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:28.971174 (+   619us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:28.971185 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:28.971763 (+   578us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:28.972454 (+   691us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:28.972471 (+    17us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.973583 (+  1112us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:28.973587 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:28.974421 (+   834us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:28.974435 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.974657 (+   222us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:28.975331 (+   674us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:28.975357 (+    26us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:28.977506 (+  2149us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:28.979936 (+  2430us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:28.979942 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:28.979952 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:28.980199 (+   247us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:28.980657 (+   458us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:28.980662 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:28.980664 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:28.980793 (+   129us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:28.981251 (+   458us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:28.981257 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:28.981554 (+   297us) client_negotiation.cc:770] Sending connection context
0504 14:08:28.981789 (+   235us) client_negotiation.cc:241] Negotiation successful
0504 14:08:28.982070 (+   281us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":313,"thread_start_us":142,"threads_started":1}
I20260504 14:08:28.984524  2542 heartbeater.cc:344] Connected to a master server at 127.25.254.254:39351
I20260504 14:08:28.984717  2542 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:28.985311  2542 heartbeater.cc:507] Master 127.25.254.254:39351 requested a full tablet report, sending...
I20260504 14:08:28.987082  2336 ts_manager.cc:194] Registered new tserver with Master: f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293)
I20260504 14:08:28.988674  2336 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:38117
I20260504 14:08:28.990971 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 2411
I20260504 14:08:28.991102 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:28.991433 26619 external_mini_cluster.cc:1468] Setting key ce189cfa4baa6972d65b5ca3a86de480
I20260504 14:08:28.994601  2548 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:28.987165 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.193:51677)
0504 14:08:28.987526 (+   361us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:28.987541 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:28.987669 (+   128us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:28.988000 (+   331us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:28.988003 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:28.988221 (+   218us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:28.988506 (+   285us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:28.988512 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.989401 (+   889us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:28.989408 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:28.990022 (+   614us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:28.990032 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.990181 (+   149us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:28.990720 (+   539us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:28.990737 (+    17us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:28.991130 (+   393us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:28.993087 (+  1957us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:28.993091 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:28.993093 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:28.993322 (+   229us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:28.993579 (+   257us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:28.993582 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:28.993583 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:28.993630 (+    47us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:28.994085 (+   455us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:28.994088 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:28.994243 (+   155us) client_negotiation.cc:770] Sending connection context
0504 14:08:28.994344 (+   101us) client_negotiation.cc:241] Negotiation successful
0504 14:08:28.994459 (+   115us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":275,"thread_start_us":113,"threads_started":1}
I20260504 14:08:28.994778  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:28.987229 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:51677 (local address 127.25.254.254:39351)
0504 14:08:28.987375 (+   146us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:28.987379 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:28.987649 (+   270us) server_negotiation.cc:408] Connection header received
0504 14:08:28.987796 (+   147us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:28.987799 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:28.987847 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:28.987950 (+   103us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:28.988636 (+   686us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.989230 (+   594us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:28.990205 (+   975us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:28.990349 (+   144us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:28.991280 (+   931us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:28.991302 (+    22us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:28.991306 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:28.991335 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:28.992878 (+  1543us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.993426 (+   548us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.993430 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.993432 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.993481 (+    49us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:28.993731 (+   250us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:28.993737 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:28.993739 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:28.993934 (+   195us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:28.994041 (+   107us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:28.994422 (+   381us) server_negotiation.cc:300] Negotiation successful
0504 14:08:28.994554 (+   132us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":62}
WARNING: no policy specified for oryx/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:29.049285 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:39351
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46295
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:29.155347  2553 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:29.155579  2553 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:29.155646  2553 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:29.159031  2553 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:29.159101  2553 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:29.159155  2553 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:08:29.159197  2553 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:29.163504  2553 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46295
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:39351
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2553
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:29.164599  2553 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:29.165386  2553 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:29.172046  2561 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:29.172048  2558 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:29.172048  2559 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:29.172483  2553 server_base.cc:1061] running on GCE node
I20260504 14:08:29.172953  2553 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:29.173521  2553 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:29.174789  2553 hybrid_clock.cc:648] HybridClock initialized: now 1777903709174764 us; error 38 us; skew 500 ppm
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:29.177717  2553 init.cc:377] Logged in from keytab as oryx/127.25.254.194@KRBTEST.COM (short username oryx)
I20260504 14:08:29.178939  2553 webserver.cc:492] Webserver started at http://127.25.254.194:42919/ using document root <none> and password file <none>
I20260504 14:08:29.179538  2553 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:29.179615  2553 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:29.179950  2553 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:29.181726  2553 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "b2991172abc5408aa043125b7f6941f9"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "7605747540c3b37b9593b43ffeb3e309"
server_key_iv: "3593c215f736bb9c30f011b542c05cb1"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.182309  2553 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "b2991172abc5408aa043125b7f6941f9"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "7605747540c3b37b9593b43ffeb3e309"
server_key_iv: "3593c215f736bb9c30f011b542c05cb1"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.185964  2553 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:29.188370  2568 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:29.189594  2553 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:29.189824  2553 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "b2991172abc5408aa043125b7f6941f9"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "7605747540c3b37b9593b43ffeb3e309"
server_key_iv: "3593c215f736bb9c30f011b542c05cb1"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.189946  2553 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:29.213692  2553 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:29.216782  2553 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:29.217010  2553 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:29.218644  2553 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:29.218714  2553 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:29.218765  2553 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:29.218798  2553 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:29.231364  2553 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:44849
I20260504 14:08:29.231491  2683 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:44849 every 8 connection(s)
I20260504 14:08:29.232456  2553 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:29.233122  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.219195 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:38237 (local address 127.25.254.254:39351)
0504 14:08:29.219345 (+   150us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.219350 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.221403 (+  2053us) server_negotiation.cc:408] Connection header received
0504 14:08:29.222699 (+  1296us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.222702 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.222766 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.222889 (+   123us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.224739 (+  1850us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.225557 (+   818us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.226278 (+   721us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.226453 (+   175us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.228940 (+  2487us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.228960 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.228963 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.229012 (+    49us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.230930 (+  1918us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.231549 (+   619us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.231554 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.231558 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.231619 (+    61us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.231927 (+   308us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.231930 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.231932 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.232162 (+   230us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.232287 (+   125us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.232808 (+   521us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.232966 (+   158us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":49}
I20260504 14:08:29.233759  2577 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.220736 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.194:38237)
0504 14:08:29.221232 (+   496us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.221265 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.222455 (+  1190us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.223190 (+   735us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.223211 (+    21us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.223818 (+   607us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:29.224549 (+   731us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.224568 (+    19us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.225694 (+  1126us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.225698 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.226103 (+   405us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.226110 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.226376 (+   266us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.227163 (+   787us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:29.227191 (+    28us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:29.228737 (+  1546us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:29.231064 (+  2327us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.231071 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.231082 (+    11us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.231420 (+   338us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.231737 (+   317us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.231739 (+     2us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.231741 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.231824 (+    83us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.232297 (+   473us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:29.232306 (+     9us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:29.232513 (+   207us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.232758 (+   245us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.232984 (+   226us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":315,"thread_start_us":130,"threads_started":1}
I20260504 14:08:29.234875  2684 heartbeater.cc:344] Connected to a master server at 127.25.254.254:39351
I20260504 14:08:29.235057  2684 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:29.235600 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 2553
I20260504 14:08:29.235633  2684 heartbeater.cc:507] Master 127.25.254.254:39351 requested a full tablet report, sending...
I20260504 14:08:29.235723 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:29.236011 26619 external_mini_cluster.cc:1468] Setting key 5c2f5e5f6ae99951bfb99e15d499c923
I20260504 14:08:29.237098  2336 ts_manager.cc:194] Registered new tserver with Master: b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849)
I20260504 14:08:29.237813  2336 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:38237
I20260504 14:08:29.244524  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.237174 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:49799 (local address 127.25.254.254:39351)
0504 14:08:29.237293 (+   119us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.237297 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.237591 (+   294us) server_negotiation.cc:408] Connection header received
0504 14:08:29.237736 (+   145us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.237739 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.237784 (+    45us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.237866 (+    82us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.238582 (+   716us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.239080 (+   498us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.239985 (+   905us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.240160 (+   175us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.241276 (+  1116us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.241296 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.241300 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.241340 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.242953 (+  1613us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.243415 (+   462us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.243417 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.243419 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.243459 (+    40us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.243739 (+   280us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.243742 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.243744 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.243923 (+   179us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.244008 (+    85us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.244316 (+   308us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.244403 (+    87us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":40}
I20260504 14:08:29.244612  2690 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.237097 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.194:49799)
0504 14:08:29.237509 (+   412us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.237521 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.237637 (+   116us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.237882 (+   245us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.237885 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.238186 (+   301us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:29.238461 (+   275us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.238474 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.239229 (+   755us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.239235 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.239852 (+   617us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.239864 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.239995 (+   131us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.240651 (+   656us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:29.240673 (+    22us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:29.241106 (+   433us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:29.243078 (+  1972us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.243084 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.243088 (+     4us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.243316 (+   228us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.243547 (+   231us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.243551 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.243553 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.243617 (+    64us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.244030 (+   413us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:29.244033 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:29.244137 (+   104us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.244262 (+   125us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.244400 (+   138us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":339,"thread_start_us":100,"threads_started":1}
WARNING: no policy specified for oryx/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab.
I20260504 14:08:29.295126 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:39351
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:46295
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--enable_txn_system_client_init=true with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:29.399843  2695 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:29.400087  2695 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:29.400146  2695 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:29.403687  2695 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:29.403759  2695 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:29.403812  2695 flags.cc:432] Enabled experimental flag: --enable_txn_system_client_init=true
W20260504 14:08:29.403854  2695 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:29.408061  2695 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46295
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/krb5kdc/oryx.keytab
--principal=oryx/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--enable_txn_system_client_init=true
--tserver_master_addrs=127.25.254.254:39351
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2695
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:29.409126  2695 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:29.409938  2695 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:29.416718  2703 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:29.416790  2701 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:29.416723  2700 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:29.416992  2695 server_base.cc:1061] running on GCE node
I20260504 14:08:29.417408  2695 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:29.417982  2695 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:29.419183  2695 hybrid_clock.cc:648] HybridClock initialized: now 1777903709419157 us; error 39 us; skew 500 ppm
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:29.422241  2695 init.cc:377] Logged in from keytab as oryx/127.25.254.195@KRBTEST.COM (short username oryx)
I20260504 14:08:29.423486  2695 webserver.cc:492] Webserver started at http://127.25.254.195:40279/ using document root <none> and password file <none>
I20260504 14:08:29.424317  2695 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:29.424438  2695 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:29.424764  2695 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:29.426726  2695 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "c0ae8e686f97abe3b4612531c6b9b67a"
server_key_iv: "4c26c8e9f936ae4560a34c4bc8cd147c"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.427238  2695 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "c0ae8e686f97abe3b4612531c6b9b67a"
server_key_iv: "4c26c8e9f936ae4560a34c4bc8cd147c"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.430982  2695 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:29.433280  2710 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:29.434494  2695 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:29.434677  2695 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
format_stamp: "Formatted at 2026-05-04 14:08:29 on dist-test-slave-2x32"
server_key: "c0ae8e686f97abe3b4612531c6b9b67a"
server_key_iv: "4c26c8e9f936ae4560a34c4bc8cd147c"
server_key_version: "encryptionkey@0"
I20260504 14:08:29.434793  2695 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:29.450551  2695 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:29.453691  2695 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:29.453917  2695 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:29.455554  2695 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:29.455646  2695 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:29.455724  2695 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:29.455770  2695 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:29.467389  2695 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:44789
I20260504 14:08:29.467410  2825 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:44789 every 8 connection(s)
I20260504 14:08:29.468326  2695 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:29.469148  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.456415 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:36535 (local address 127.25.254.254:39351)
0504 14:08:29.456547 (+   132us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.456550 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.457354 (+   804us) server_negotiation.cc:408] Connection header received
0504 14:08:29.458293 (+   939us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.458297 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.458360 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.458461 (+   101us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.460413 (+  1952us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.461418 (+  1005us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.462458 (+  1040us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.462673 (+   215us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.465169 (+  2496us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.465196 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.465199 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.465224 (+    25us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.467072 (+  1848us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.467680 (+   608us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.467685 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.467687 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.467750 (+    63us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.468072 (+   322us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.468075 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.468077 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.468234 (+   157us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.468339 (+   105us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.468862 (+   523us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.468987 (+   125us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":43}
I20260504 14:08:29.469769  2719 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.456760 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.195:36535)
0504 14:08:29.457219 (+   459us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.457251 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.458029 (+   778us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.458620 (+   591us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.458628 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.459234 (+   606us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:29.460228 (+   994us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.460266 (+    38us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.461606 (+  1340us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.461612 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.462310 (+   698us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.462323 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.462560 (+   237us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.463210 (+   650us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:29.463237 (+    27us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:29.465001 (+  1764us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:29.467201 (+  2200us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.467206 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.467220 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.467551 (+   331us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.467851 (+   300us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.467854 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.467856 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.467967 (+   111us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.468339 (+   372us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:29.468346 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:29.468617 (+   271us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.468814 (+   197us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.469036 (+   222us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":298,"thread_start_us":143,"threads_started":1}
I20260504 14:08:29.470896  2826 heartbeater.cc:344] Connected to a master server at 127.25.254.254:39351
I20260504 14:08:29.471078  2826 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:29.471710 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 2695
I20260504 14:08:29.471735  2826 heartbeater.cc:507] Master 127.25.254.254:39351 requested a full tablet report, sending...
I20260504 14:08:29.471840 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:29.472131 26619 external_mini_cluster.cc:1468] Setting key ea84a44245bd81c99e4b0f1bec939c50
I20260504 14:08:29.472795  2337 ts_manager.cc:194] Registered new tserver with Master: 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:29.473562  2337 master_service.cc:502] Signed X509 certificate for tserver {username='oryx', principal='oryx/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:36535
I20260504 14:08:29.474961 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:08:29.480473  2832 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.473307 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.25.254.195:51117)
0504 14:08:29.473672 (+   365us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.473687 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.473785 (+    98us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.474147 (+   362us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.474149 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.474366 (+   217us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:29.474629 (+   263us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.474640 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.475572 (+   932us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.475575 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.475984 (+   409us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.475992 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.476087 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.476646 (+   559us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:29.476660 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:29.476951 (+   291us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:29.479002 (+  2051us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.479009 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.479012 (+     3us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.479237 (+   225us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.479517 (+   280us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.479521 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.479523 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.479588 (+    65us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.480074 (+   486us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:29.480077 (+     3us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:29.480152 (+    75us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.480237 (+    85us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.480334 (+    97us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":288,"thread_start_us":97,"threads_started":1}
I20260504 14:08:29.480756  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.473376 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:51117 (local address 127.25.254.254:39351)
0504 14:08:29.473509 (+   133us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.473514 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.473766 (+   252us) server_negotiation.cc:408] Connection header received
0504 14:08:29.473904 (+   138us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.473907 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.473968 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.474010 (+    42us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.474750 (+   740us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.475435 (+   685us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.476106 (+   671us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.476330 (+   224us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.477109 (+   779us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.477130 (+    21us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.477135 (+     5us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.477169 (+    34us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.478870 (+  1701us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.479362 (+   492us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.479366 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.479367 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.479413 (+    46us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.479765 (+   352us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.479771 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.479774 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.479966 (+   192us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.480080 (+   114us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.480334 (+   254us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.480455 (+   121us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-user@KRBTEST.COM: 
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  test-user@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  test-user@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
W20260504 14:08:29.499392  2386 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:29.493279 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55226 (local address 127.25.254.254:39351)
0504 14:08:29.493434 (+   155us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.493438 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.493452 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:29.493497 (+    45us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.493500 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.493560 (+    60us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.493696 (+   136us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.494668 (+   972us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.495213 (+   545us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.495967 (+   754us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.496201 (+   234us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.499278 (+  3077us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:55226: BlockingRecv error: recv got EOF from 127.0.0.1:55226 (error 108)
Metrics: {"server-negotiator.queue_time_us":67}
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903709, etypes {rep=17 tkt=17 ses=17}, test-user@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:29.509222  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.501206 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55228 (local address 127.25.254.254:39351)
0504 14:08:29.501388 (+   182us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.501392 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.501532 (+   140us) server_negotiation.cc:408] Connection header received
0504 14:08:29.501726 (+   194us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.501729 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.501773 (+    44us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.501860 (+    87us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.502660 (+   800us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.503144 (+   484us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.503771 (+   627us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.503973 (+   202us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.506214 (+  2241us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.506239 (+    25us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.506241 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.506268 (+    27us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.507647 (+  1379us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.508071 (+   424us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.508075 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.508077 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.508125 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.508363 (+   238us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.508367 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.508369 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.508522 (+   153us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.508659 (+   137us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.508932 (+   273us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.509045 (+   113us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":64}
I20260504 14:08:29.512147  2337 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:08:29.514627  2337 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:29.528858  2848 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.524332 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44849 (local address 127.0.0.1:49956)
0504 14:08:29.524878 (+   546us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.524918 (+    40us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.525082 (+   164us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.525611 (+   529us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.525616 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.525637 (+    21us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.525972 (+   335us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.525982 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.527449 (+  1467us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.527452 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.528389 (+   937us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.528396 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.528493 (+    97us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.528546 (+    53us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.528627 (+    81us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.528708 (+    81us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":386,"thread_start_us":107,"threads_started":1}
I20260504 14:08:29.529446  2849 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.524496 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:49956 (local address 127.25.254.194:44849)
0504 14:08:29.524844 (+   348us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.524850 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.525075 (+   225us) server_negotiation.cc:408] Connection header received
0504 14:08:29.525267 (+   192us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.525273 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.525470 (+   197us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.525603 (+   133us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.526107 (+   504us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.527327 (+  1220us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.528527 (+  1200us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.529048 (+   521us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.529165 (+   117us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.529229 (+    64us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.529305 (+    76us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":238,"thread_start_us":136,"threads_started":1}
I20260504 14:08:29.529536  2852 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.525401 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.0.0.1:57610)
0504 14:08:29.525835 (+   434us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.525848 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.525938 (+    90us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.526677 (+   739us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.526680 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.526700 (+    20us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.526973 (+   273us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.526980 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.528365 (+  1385us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.528369 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.529199 (+   830us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.529209 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.529294 (+    85us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.529308 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.529351 (+    43us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.529412 (+    61us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":372,"thread_start_us":92,"threads_started":1}
I20260504 14:08:29.529913  2850 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.525069 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:44789 (local address 127.0.0.1:51874)
0504 14:08:29.525363 (+   294us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.525377 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.525475 (+    98us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.526569 (+  1094us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.526572 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.526608 (+    36us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.526854 (+   246us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.526860 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.527997 (+  1137us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.528003 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.529535 (+  1532us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.529549 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.529666 (+   117us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.529693 (+    27us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.529746 (+    53us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.529803 (+    57us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":220,"thread_start_us":101,"threads_started":1}
I20260504 14:08:29.530606  2853 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.525803 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57610 (local address 127.25.254.193:33293)
0504 14:08:29.526255 (+   452us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.526261 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.526283 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:08:29.526358 (+    75us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.526363 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.526514 (+   151us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.527112 (+   598us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.527230 (+   118us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.528238 (+  1008us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.529470 (+  1232us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.530241 (+   771us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.530341 (+   100us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.530413 (+    72us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.530502 (+    89us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":328,"thread_start_us":79,"threads_started":1}
I20260504 14:08:29.530773  2851 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.525340 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51874 (local address 127.25.254.195:44789)
0504 14:08:29.526222 (+   882us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.526229 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.526256 (+    27us) server_negotiation.cc:408] Connection header received
0504 14:08:29.526319 (+    63us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.526322 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.526435 (+   113us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.526582 (+   147us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.526981 (+   399us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.527876 (+   895us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.529802 (+  1926us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.530417 (+   615us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.530520 (+   103us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.530582 (+    62us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.530656 (+    74us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":749,"thread_start_us":101,"threads_started":1}
I20260504 14:08:29.532744  2618 tablet_service.cc:1511] Processing CreateTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b (DEFAULT_TABLE table=test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.533073  2760 tablet_service.cc:1511] Processing CreateTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b (DEFAULT_TABLE table=test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.533073  2476 tablet_service.cc:1511] Processing CreateTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b (DEFAULT_TABLE table=test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.533670  2618 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 04c0e2f71d864d0eb15d48581d1f701b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.533955  2476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 04c0e2f71d864d0eb15d48581d1f701b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.533949  2760 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 04c0e2f71d864d0eb15d48581d1f701b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.539762  2855 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Bootstrap starting.
I20260504 14:08:29.540323  2856 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Bootstrap starting.
I20260504 14:08:29.540684  2854 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Bootstrap starting.
I20260504 14:08:29.541946  2855 tablet_bootstrap.cc:654] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.542657  2856 tablet_bootstrap.cc:654] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.542776  2855 log.cc:826] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:29.543947  2856 log.cc:826] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:29.544242  2854 tablet_bootstrap.cc:654] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.545339  2854 log.cc:826] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:29.545740  2855 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: No bootstrap required, opened a new log
I20260504 14:08:29.545737  2856 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: No bootstrap required, opened a new log
I20260504 14:08:29.545943  2855 ts_tablet_manager.cc:1403] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Time spent bootstrapping tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:08:29.545969  2856 ts_tablet_manager.cc:1403] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent bootstrapping tablet: real 0.006s	user 0.004s	sys 0.000s
I20260504 14:08:29.547384  2854 tablet_bootstrap.cc:492] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: No bootstrap required, opened a new log
I20260504 14:08:29.547590  2854 ts_tablet_manager.cc:1403] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Time spent bootstrapping tablet: real 0.007s	user 0.004s	sys 0.000s
I20260504 14:08:29.549116  2855 raft_consensus.cc:359] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.549116  2856 raft_consensus.cc:359] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.549345  2855 raft_consensus.cc:385] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.549345  2856 raft_consensus.cc:385] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.549415  2856 raft_consensus.cc:740] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.549433  2855 raft_consensus.cc:740] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.549887  2856 consensus_queue.cc:260] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.549887  2855 consensus_queue.cc:260] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.549975  2854 raft_consensus.cc:359] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.550206  2854 raft_consensus.cc:385] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.550292  2854 raft_consensus.cc:740] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.550724  2854 consensus_queue.cc:260] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.550832  2855 ts_tablet_manager.cc:1434] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Time spent starting tablet: real 0.005s	user 0.003s	sys 0.001s
I20260504 14:08:29.550832  2856 ts_tablet_manager.cc:1434] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent starting tablet: real 0.005s	user 0.005s	sys 0.001s
I20260504 14:08:29.551210  2826 heartbeater.cc:499] Master 127.25.254.254:39351 was elected leader, sending a full tablet report...
I20260504 14:08:29.551215  2542 heartbeater.cc:499] Master 127.25.254.254:39351 was elected leader, sending a full tablet report...
I20260504 14:08:29.551367  2684 heartbeater.cc:499] Master 127.25.254.254:39351 was elected leader, sending a full tablet report...
I20260504 14:08:29.553452  2854 ts_tablet_manager.cc:1434] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Time spent starting tablet: real 0.006s	user 0.000s	sys 0.004s
I20260504 14:08:29.565379  2860 raft_consensus.cc:493] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:29.565574  2860 raft_consensus.cc:515] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.566807  2860 leader_election.cc:290] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:29.570020  2863 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.567136 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44849 (local address 127.25.254.193:41711)
0504 14:08:29.567470 (+   334us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.567483 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.567638 (+   155us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.567942 (+   304us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.567946 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.567968 (+    22us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.568214 (+   246us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.568220 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.568985 (+   765us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.568988 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.569575 (+   587us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.569581 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.569731 (+   150us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.569749 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.569805 (+    56us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.569859 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":259,"thread_start_us":165,"threads_started":1}
I20260504 14:08:29.570349  2864 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.567676 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:44789 (local address 127.25.254.193:55849)
0504 14:08:29.567966 (+   290us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.567978 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.568101 (+   123us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.568341 (+   240us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.568344 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.568360 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.568615 (+   255us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.568620 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.569341 (+   721us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.569345 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.569920 (+   575us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.569928 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.570058 (+   130us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.570073 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.570123 (+    50us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.570195 (+    72us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":220,"thread_start_us":91,"threads_started":1}
I20260504 14:08:29.570519  2849 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.567209 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:41711 (local address 127.25.254.194:44849)
0504 14:08:29.567366 (+   157us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.567371 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.567562 (+   191us) server_negotiation.cc:408] Connection header received
0504 14:08:29.567768 (+   206us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.567775 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.567836 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.567929 (+    93us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.568330 (+   401us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.568841 (+   511us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.569713 (+   872us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.570244 (+   531us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.570279 (+    35us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.570334 (+    55us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.570394 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":55}
I20260504 14:08:29.570827  2851 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.567752 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:55849 (local address 127.25.254.195:44789)
0504 14:08:29.567903 (+   151us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.567907 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.568056 (+   149us) server_negotiation.cc:408] Connection header received
0504 14:08:29.568209 (+   153us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.568212 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.568251 (+    39us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.568331 (+    80us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.568736 (+   405us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.569229 (+   493us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.570041 (+   812us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.570560 (+   519us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.570597 (+    37us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.570654 (+    57us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.570713 (+    59us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:29.571280  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "04c0e2f71d864d0eb15d48581d1f701b" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9" is_pre_election: true
I20260504 14:08:29.571395  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "04c0e2f71d864d0eb15d48581d1f701b" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" is_pre_election: true
I20260504 14:08:29.571576  2638 raft_consensus.cc:2468] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:29.571673  2780 raft_consensus.cc:2468] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:29.572113  2427 leader_election.cc:304] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b2991172abc5408aa043125b7f6941f9, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:29.572378  2860 raft_consensus.cc:2804] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:29.572454  2860 raft_consensus.cc:493] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:29.572521  2860 raft_consensus.cc:3060] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.573566  2860 raft_consensus.cc:515] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.573930  2860 leader_election.cc:290] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Requested vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:29.574291  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "04c0e2f71d864d0eb15d48581d1f701b" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9"
I20260504 14:08:29.574349  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "04c0e2f71d864d0eb15d48581d1f701b" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
I20260504 14:08:29.574429  2638 raft_consensus.cc:3060] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.574489  2780 raft_consensus.cc:3060] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.575410  2638 raft_consensus.cc:2468] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:29.575563  2780 raft_consensus.cc:2468] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:29.575811  2427 leader_election.cc:304] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b2991172abc5408aa043125b7f6941f9, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:29.576038  2860 raft_consensus.cc:2804] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:29.576267  2860 raft_consensus.cc:697] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 1 LEADER]: Becoming Leader. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Running, Role: LEADER
I20260504 14:08:29.576572  2860 consensus_queue.cc:237] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } }
I20260504 14:08:29.579857  2337 catalog_manager.cc:5671] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a reported cstate change: term changed from 0 to 1, leader changed from <none> to f04ea53f457241c9aa321af4ab8e451a (127.25.254.193). New cstate: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } health_report { overall_health: HEALTHY } } }
I20260504 14:08:29.594470  2853 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.591352 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57612 (local address 127.25.254.193:33293)
0504 14:08:29.591514 (+   162us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.591518 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.591534 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:29.591695 (+   161us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.591699 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.591780 (+    81us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.591893 (+   113us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:29.592254 (+   361us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.592748 (+   494us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.593411 (+   663us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.593615 (+   204us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.593666 (+    51us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:29.594046 (+   380us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:29.594168 (+   122us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.594271 (+   103us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.594319 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":55}
I20260504 14:08:29.600232  2638 raft_consensus.cc:1275] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:29.600225  2780 raft_consensus.cc:1275] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:29.601018  2865 consensus_queue.cc:1048] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:29.601279  2860 consensus_queue.cc:1048] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:29.612229  2870 mvcc.cc:204] Tried to move back new op lower bound from 7282293594515124224 to 7282293594428592128. Current Snapshot: MvccSnapshot[applied={T|T < 7282293594515124224}]
I20260504 14:08:29.618242  2337 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:29.618427  2337 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:29.621155  2337 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Sending DeleteTablet for 3 replicas of tablet 04c0e2f71d864d0eb15d48581d1f701b
I20260504 14:08:29.622032  2618 tablet_service.cc:1558] Processing DeleteTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:29 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:49956
I20260504 14:08:29.622032  2760 tablet_service.cc:1558] Processing DeleteTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:29 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:51874
I20260504 14:08:29.622602  2879 tablet_replica.cc:333] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: stopping tablet replica
I20260504 14:08:29.622830  2337 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260504 14:08:29.622845  2476 tablet_service.cc:1558] Processing DeleteTablet for tablet 04c0e2f71d864d0eb15d48581d1f701b with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:29 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57610
W20260504 14:08:29.623262  2337 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:29.623523  2881 tablet_replica.cc:333] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: stopping tablet replica
I20260504 14:08:29.623797  2881 raft_consensus.cc:2243] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:29.624161  2880 tablet_replica.cc:333] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: stopping tablet replica
I20260504 14:08:29.624231  2881 raft_consensus.cc:2272] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:29.624428  2880 raft_consensus.cc:2243] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:29.624735  2880 raft_consensus.cc:2272] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:29.625144  2879 raft_consensus.cc:2243] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:29.625478  2879 raft_consensus.cc:2272] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:29.626966  2880 ts_tablet_manager.cc:1916] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:29.627220  2881 ts_tablet_manager.cc:1916] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:29.628790  2476 tablet_service.cc:1511] Processing CreateTablet for tablet a3b88886f47c4524a0e07728863f3826 (DEFAULT_TABLE table=test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.629082  2476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3b88886f47c4524a0e07728863f3826. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.629194  2618 tablet_service.cc:1511] Processing CreateTablet for tablet a3b88886f47c4524a0e07728863f3826 (DEFAULT_TABLE table=test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.629330  2760 tablet_service.cc:1511] Processing CreateTablet for tablet a3b88886f47c4524a0e07728863f3826 (DEFAULT_TABLE table=test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:29.629449  2618 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3b88886f47c4524a0e07728863f3826. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.629542  2760 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3b88886f47c4524a0e07728863f3826. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.631608  2854 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Bootstrap starting.
I20260504 14:08:29.631633  2856 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Bootstrap starting.
I20260504 14:08:29.631640  2855 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Bootstrap starting.
I20260504 14:08:29.631742  2879 ts_tablet_manager.cc:1916] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:29.632613  2854 tablet_bootstrap.cc:654] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.632637  2856 tablet_bootstrap.cc:654] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.632639  2855 tablet_bootstrap.cc:654] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.633870  2880 ts_tablet_manager.cc:1929] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:29.633980  2880 log.cc:1199] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/04c0e2f71d864d0eb15d48581d1f701b
I20260504 14:08:29.634328  2880 ts_tablet_manager.cc:1950] T 04c0e2f71d864d0eb15d48581d1f701b P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting consensus metadata
I20260504 14:08:29.634490  2879 ts_tablet_manager.cc:1929] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:29.634577  2879 log.cc:1199] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/04c0e2f71d864d0eb15d48581d1f701b
I20260504 14:08:29.634891  2879 ts_tablet_manager.cc:1950] T 04c0e2f71d864d0eb15d48581d1f701b P b2991172abc5408aa043125b7f6941f9: Deleting consensus metadata
I20260504 14:08:29.635257  2881 ts_tablet_manager.cc:1929] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:29.635339  2881 log.cc:1199] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/04c0e2f71d864d0eb15d48581d1f701b
I20260504 14:08:29.635619  2881 ts_tablet_manager.cc:1950] T 04c0e2f71d864d0eb15d48581d1f701b P f04ea53f457241c9aa321af4ab8e451a: Deleting consensus metadata
I20260504 14:08:29.635737  2321 catalog_manager.cc:5002] TS 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789): tablet 04c0e2f71d864d0eb15d48581d1f701b (table test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]) successfully deleted
I20260504 14:08:29.636770  2321 catalog_manager.cc:5002] TS f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293): tablet 04c0e2f71d864d0eb15d48581d1f701b (table test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]) successfully deleted
I20260504 14:08:29.637270  2854 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: No bootstrap required, opened a new log
I20260504 14:08:29.637367  2854 ts_tablet_manager.cc:1403] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Time spent bootstrapping tablet: real 0.006s	user 0.001s	sys 0.001s
I20260504 14:08:29.637866  2854 raft_consensus.cc:359] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.637995  2854 raft_consensus.cc:385] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.638031  2854 raft_consensus.cc:740] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.638182  2854 consensus_queue.cc:260] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.638453  2854 ts_tablet_manager.cc:1434] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:29.640044  2855 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: No bootstrap required, opened a new log
I20260504 14:08:29.640157  2855 ts_tablet_manager.cc:1403] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Time spent bootstrapping tablet: real 0.009s	user 0.000s	sys 0.002s
I20260504 14:08:29.640327  2856 tablet_bootstrap.cc:492] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: No bootstrap required, opened a new log
I20260504 14:08:29.640420  2856 ts_tablet_manager.cc:1403] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent bootstrapping tablet: real 0.009s	user 0.000s	sys 0.002s
I20260504 14:08:29.640839  2855 raft_consensus.cc:359] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.640925  2856 raft_consensus.cc:359] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.640997  2855 raft_consensus.cc:385] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.641038  2856 raft_consensus.cc:385] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.641038  2855 raft_consensus.cc:740] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.641078  2856 raft_consensus.cc:740] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.641211  2856 consensus_queue.cc:260] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.641186  2855 consensus_queue.cc:260] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.641469  2856 ts_tablet_manager.cc:1434] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260504 14:08:29.641518  2855 ts_tablet_manager.cc:1434] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.001s
I20260504 14:08:29.643376  2321 catalog_manager.cc:5002] TS b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849): tablet 04c0e2f71d864d0eb15d48581d1f701b (table test-table [id=19ff76d95b2b426b851b0c551fdbf9f5]) successfully deleted
W20260504 14:08:29.719554  2827 tablet.cc:2404] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:08:29.733803  2685 tablet.cc:2404] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:08:29.734495  2543 tablet.cc:2404] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:08:29.768731  2862 raft_consensus.cc:493] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:29.768930  2862 raft_consensus.cc:515] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.770328  2862 leader_election.cc:290] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:29.774111  2886 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.771054 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:44789 (local address 127.25.254.194:32823)
0504 14:08:29.771297 (+   243us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.771311 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.771399 (+    88us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.771680 (+   281us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.771684 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.771698 (+    14us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.771912 (+   214us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.771921 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.772980 (+  1059us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.772984 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.773647 (+   663us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.773655 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.773799 (+   144us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.773819 (+    20us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.773872 (+    53us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.773930 (+    58us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":184,"thread_start_us":51,"threads_started":1}
I20260504 14:08:29.774552  2851 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.771177 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:32823 (local address 127.25.254.195:44789)
0504 14:08:29.771308 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.771312 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.771364 (+    52us) server_negotiation.cc:408] Connection header received
0504 14:08:29.771491 (+   127us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.771494 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.771548 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.771649 (+   101us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.772104 (+   455us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.772813 (+   709us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.773782 (+   969us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.774290 (+   508us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.774327 (+    37us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.774380 (+    53us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.774428 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":51}
I20260504 14:08:29.775004  2853 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.770729 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:50545 (local address 127.25.254.193:33293)
0504 14:08:29.770894 (+   165us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.770899 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.771019 (+   120us) server_negotiation.cc:408] Connection header received
0504 14:08:29.771258 (+   239us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.771262 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.771321 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.771399 (+    78us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:29.771994 (+   595us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.772777 (+   783us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.774040 (+  1263us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.774551 (+   511us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.774587 (+    36us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.774813 (+   226us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.774889 (+    76us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":64}
I20260504 14:08:29.775015  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "a3b88886f47c4524a0e07728863f3826" candidate_uuid: "b2991172abc5408aa043125b7f6941f9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" is_pre_election: true
I20260504 14:08:29.775184  2780 raft_consensus.cc:2468] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b2991172abc5408aa043125b7f6941f9 in term 0.
I20260504 14:08:29.775408  2885 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.770553 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.25.254.194:50545)
0504 14:08:29.770891 (+   338us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.770945 (+    54us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.771108 (+   163us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.771547 (+   439us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.771549 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.771563 (+    14us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:29.771852 (+   289us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.771858 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.772943 (+  1085us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.772947 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.773835 (+   888us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.773847 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.774657 (+   810us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.774671 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.775206 (+   535us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.775270 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":250,"thread_start_us":172,"threads_started":1}
I20260504 14:08:29.775540  2569 leader_election.cc:304] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, b2991172abc5408aa043125b7f6941f9; no voters: 
I20260504 14:08:29.775825  2862 raft_consensus.cc:2804] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:29.775905  2862 raft_consensus.cc:493] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:29.775949  2862 raft_consensus.cc:3060] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.776324  2495 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "a3b88886f47c4524a0e07728863f3826" candidate_uuid: "b2991172abc5408aa043125b7f6941f9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f04ea53f457241c9aa321af4ab8e451a" is_pre_election: true
I20260504 14:08:29.776521  2495 raft_consensus.cc:2468] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b2991172abc5408aa043125b7f6941f9 in term 0.
I20260504 14:08:29.777170  2862 raft_consensus.cc:515] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.777540  2862 leader_election.cc:290] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [CANDIDATE]: Term 1 election: Requested vote from peers f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:29.777956  2495 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "a3b88886f47c4524a0e07728863f3826" candidate_uuid: "b2991172abc5408aa043125b7f6941f9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f04ea53f457241c9aa321af4ab8e451a"
I20260504 14:08:29.778025  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "a3b88886f47c4524a0e07728863f3826" candidate_uuid: "b2991172abc5408aa043125b7f6941f9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
I20260504 14:08:29.778081  2495 raft_consensus.cc:3060] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.778190  2780 raft_consensus.cc:3060] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:29.779026  2780 raft_consensus.cc:2468] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b2991172abc5408aa043125b7f6941f9 in term 1.
I20260504 14:08:29.779335  2495 raft_consensus.cc:2468] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b2991172abc5408aa043125b7f6941f9 in term 1.
I20260504 14:08:29.779421  2569 leader_election.cc:304] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, b2991172abc5408aa043125b7f6941f9; no voters: 
I20260504 14:08:29.779610  2862 raft_consensus.cc:2804] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:29.779805  2862 raft_consensus.cc:697] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 1 LEADER]: Becoming Leader. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Running, Role: LEADER
I20260504 14:08:29.780076  2862 consensus_queue.cc:237] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.782943  2337 catalog_manager.cc:5671] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 reported cstate change: term changed from 0 to 1, leader changed from <none> to b2991172abc5408aa043125b7f6941f9 (127.25.254.194). New cstate: current_term: 1 leader_uuid: "b2991172abc5408aa043125b7f6941f9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } health_report { overall_health: UNKNOWN } } }
May 04 14:08:29 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903708, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.254@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:29.820948  2894 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.812268 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:39351 (local address 127.0.0.1:55244)
0504 14:08:29.812601 (+   333us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:29.812617 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:29.812780 (+   163us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:29.813068 (+   288us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:29.813071 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:29.813529 (+   458us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:29.813723 (+   194us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.813729 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.814738 (+  1009us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.814744 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:29.815159 (+   415us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:29.815166 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.815312 (+   146us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.815995 (+   683us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:29.816019 (+    24us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:29.817603 (+  1584us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:29.819320 (+  1717us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.819330 (+    10us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.819335 (+     5us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.819626 (+   291us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.819911 (+   285us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:29.819918 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:29.819920 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:29.819966 (+    46us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:29.820329 (+   363us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:29.820336 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:29.820436 (+   100us) client_negotiation.cc:770] Sending connection context
0504 14:08:29.820561 (+   125us) client_negotiation.cc:241] Negotiation successful
0504 14:08:29.820746 (+   185us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":243,"thread_start_us":125,"threads_started":1}
I20260504 14:08:29.820948  2386 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:29.812369 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55244 (local address 127.25.254.254:39351)
0504 14:08:29.812533 (+   164us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:29.812538 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:29.812714 (+   176us) server_negotiation.cc:408] Connection header received
0504 14:08:29.812914 (+   200us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:29.812917 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:29.812963 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:29.813049 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:29.813858 (+   809us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.814551 (+   693us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:29.815285 (+   734us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:29.815465 (+   180us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:29.817740 (+  2275us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:29.817767 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:29.817769 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:29.817791 (+    22us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:29.819178 (+  1387us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.819731 (+   553us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.819736 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.819737 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.819785 (+    48us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:29.820062 (+   277us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:29.820065 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:29.820067 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:29.820214 (+   147us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:29.820312 (+    98us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:29.820627 (+   315us) server_negotiation.cc:300] Negotiation successful
0504 14:08:29.820809 (+   182us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"mutex_wait_us":76,"server-negotiator.queue_time_us":56}
I20260504 14:08:29.825356  2337 catalog_manager.cc:2257] Servicing CreateTable request from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:55244:
name: "kudu_system.kudu_transactions"
schema {
  columns {
    name: "txn_id"
    type: INT64
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "entry_type"
    type: INT8
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "identifier"
    type: STRING
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "metadata"
    type: STRING
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\006\001\000\000\000\000\000\000\000\000\007\001@B\017\000\000\000\000\000"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "txn_id"
    }
  }
}
table_type: TXN_STATUS_TABLE
W20260504 14:08:29.826323  2337 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table kudu_system.kudu_transactions in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:29.832206  2476 tablet_service.cc:1511] Processing CreateTablet for tablet 1a6d2d2e3f0c42159160308ce83d7303 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=e6faa37c6cb5452c8f92b3c5bcd0ff9f]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:08:29.832206  2760 tablet_service.cc:1511] Processing CreateTablet for tablet 1a6d2d2e3f0c42159160308ce83d7303 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=e6faa37c6cb5452c8f92b3c5bcd0ff9f]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:08:29.832206  2618 tablet_service.cc:1511] Processing CreateTablet for tablet 1a6d2d2e3f0c42159160308ce83d7303 (TXN_STATUS_TABLE table=kudu_system.kudu_transactions [id=e6faa37c6cb5452c8f92b3c5bcd0ff9f]), partition=RANGE (txn_id) PARTITION 0 <= VALUES < 1000000
I20260504 14:08:29.832607  2476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1a6d2d2e3f0c42159160308ce83d7303. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.832620  2618 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1a6d2d2e3f0c42159160308ce83d7303. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.832631  2760 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1a6d2d2e3f0c42159160308ce83d7303. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:29.835009  2855 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: Bootstrap starting.
I20260504 14:08:29.835409  2854 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: Bootstrap starting.
I20260504 14:08:29.836021  2855 tablet_bootstrap.cc:654] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.836459  2856 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Bootstrap starting.
I20260504 14:08:29.837230  2855 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: No bootstrap required, opened a new log
I20260504 14:08:29.837307  2855 ts_tablet_manager.cc:1403] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.001s
I20260504 14:08:29.837502  2856 tablet_bootstrap.cc:654] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.837776  2855 raft_consensus.cc:359] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.837898  2855 raft_consensus.cc:385] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.837940  2855 raft_consensus.cc:740] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.838084  2855 consensus_queue.cc:260] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.838423  2860 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:29.838565  2860 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:29.838780  2854 tablet_bootstrap.cc:654] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:29.838429  2855 ts_tablet_manager.cc:1434] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:29.840174  2856 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: No bootstrap required, opened a new log
I20260504 14:08:29.840171  2854 tablet_bootstrap.cc:492] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: No bootstrap required, opened a new log
I20260504 14:08:29.840262  2856 ts_tablet_manager.cc:1403] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent bootstrapping tablet: real 0.004s	user 0.001s	sys 0.001s
I20260504 14:08:29.840262  2854 ts_tablet_manager.cc:1403] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: Time spent bootstrapping tablet: real 0.005s	user 0.002s	sys 0.000s
I20260504 14:08:29.840725  2854 raft_consensus.cc:359] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.840731  2856 raft_consensus.cc:359] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.840830  2856 raft_consensus.cc:385] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.840830  2854 raft_consensus.cc:385] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:29.840864  2856 raft_consensus.cc:740] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.840864  2854 raft_consensus.cc:740] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Initialized, Role: FOLLOWER
I20260504 14:08:29.841007  2856 consensus_queue.cc:260] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.841039  2854 consensus_queue.cc:260] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:29.841280  2856 ts_tablet_manager.cc:1434] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:29.841315  2854 ts_tablet_manager.cc:1434] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260504 14:08:29.841615  2861 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:29.841935  2861 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:29.842136  2862 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: TxnStatusTablet state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 0 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:29.842278  2862 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:30.201038  2862 consensus_queue.cc:1048] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.205505  2862 consensus_queue.cc:1048] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.253532  2860 raft_consensus.cc:493] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:30.253742  2860 raft_consensus.cc:515] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.254276  2860 leader_election.cc:290] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:30.254875  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1a6d2d2e3f0c42159160308ce83d7303" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9" is_pre_election: true
I20260504 14:08:30.254994  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1a6d2d2e3f0c42159160308ce83d7303" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" is_pre_election: true
I20260504 14:08:30.255040  2638 raft_consensus.cc:2468] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:30.255128  2780 raft_consensus.cc:2468] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:30.255443  2427 leader_election.cc:304] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:30.255659  2860 raft_consensus.cc:2804] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:30.255754  2860 raft_consensus.cc:493] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:30.255784  2860 raft_consensus.cc:3060] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.256719  2860 raft_consensus.cc:515] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.257066  2860 leader_election.cc:290] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Requested vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:30.257526  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1a6d2d2e3f0c42159160308ce83d7303" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9"
I20260504 14:08:30.257596  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1a6d2d2e3f0c42159160308ce83d7303" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
I20260504 14:08:30.257663  2638 raft_consensus.cc:3060] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.257699  2780 raft_consensus.cc:3060] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.258603  2638 raft_consensus.cc:2468] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:30.258616  2780 raft_consensus.cc:2468] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:30.258970  2427 leader_election.cc:304] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b2991172abc5408aa043125b7f6941f9, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:30.259169  2860 raft_consensus.cc:2804] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:30.259370  2860 raft_consensus.cc:697] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [term 1 LEADER]: Becoming Leader. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Running, Role: LEADER
I20260504 14:08:30.259526  2860 consensus_queue.cc:237] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.260239  2903 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: TxnStatusTablet state changed. Reason: New leader f04ea53f457241c9aa321af4ab8e451a. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.260354  2903 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:08:30.260627  2905 txn_status_manager.cc:874] Waiting until node catch up with all replicated operations in previous term...
I20260504 14:08:30.260780  2905 txn_status_manager.cc:930] Loading transaction status metadata into memory...
I20260504 14:08:30.261471  2335 catalog_manager.cc:5671] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a reported cstate change: term changed from 0 to 1, leader changed from <none> to f04ea53f457241c9aa321af4ab8e451a (127.25.254.193). New cstate: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } health_report { overall_health: UNKNOWN } } }
I20260504 14:08:30.299252  2908 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.296392 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57616 (local address 127.25.254.193:33293)
0504 14:08:30.296724 (+   332us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.296728 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.296740 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:30.296784 (+    44us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.296786 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.296834 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.296960 (+   126us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:30.297312 (+   352us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.297828 (+   516us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.298518 (+   690us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.298727 (+   209us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.298765 (+    38us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.298837 (+    72us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.298925 (+    88us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.299059 (+   134us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.299107 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":189,"thread_start_us":72,"threads_started":1}
I20260504 14:08:30.299247  2894 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.296226 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.0.0.1:57616)
0504 14:08:30.296402 (+   176us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:30.296419 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:30.296552 (+   133us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:30.296976 (+   424us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:30.296979 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:30.296987 (+     8us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:08:30.297187 (+   200us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.297193 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.297955 (+   762us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.297959 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:30.298391 (+   432us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.298397 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.298523 (+   126us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.298566 (+    43us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.298937 (+   371us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.298952 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:30.299035 (+    83us) client_negotiation.cc:241] Negotiation successful
0504 14:08:30.299107 (+    72us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":57}
I20260504 14:08:30.302141  2780 raft_consensus.cc:1275] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:30.302618  2860 consensus_queue.cc:1048] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.302816  2638 raft_consensus.cc:1275] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:30.303972  2861 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: TxnStatusTablet state changed. Reason: New leader f04ea53f457241c9aa321af4ab8e451a. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.304075  2861 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:30.304171  2860 consensus_queue.cc:1048] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.305923  2862 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: TxnStatusTablet state changed. Reason: New leader f04ea53f457241c9aa321af4ab8e451a. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.306033  2862 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:30.306001  2860 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.306111  2860 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:08:30.306226  2902 mvcc.cc:204] Tried to move back new op lower bound from 7282293597392789504 to 7282293597224452096. Current Snapshot: MvccSnapshot[applied={T|T < 7282293597392789504}]
I20260504 14:08:30.307258  2861 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.307376  2861 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:30.307549  2900 mvcc.cc:204] Tried to move back new op lower bound from 7282293597392789504 to 7282293597224452096. Current Snapshot: MvccSnapshot[applied={T|T < 7282293597392789504}]
I20260504 14:08:30.309749  2903 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: TxnStatusTablet state changed. Reason: Peer health change. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.309882  2903 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P f04ea53f457241c9aa321af4ab8e451a: This TxnStatusTablet replica's current role is: LEADER
I20260504 14:08:30.310487  2862 tablet_replica.cc:442] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: TxnStatusTablet state changed. Reason: Replicated consensus-only round. Latest consensus state: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } } }
I20260504 14:08:30.310595  2862 tablet_replica.cc:445] T 1a6d2d2e3f0c42159160308ce83d7303 P b2991172abc5408aa043125b7f6941f9: This TxnStatusTablet replica's current role is: FOLLOWER
I20260504 14:08:30.318238  2917 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.314623 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:49966 (local address 127.25.254.194:44849)
0504 14:08:30.314839 (+   216us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.314842 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.314926 (+    84us) server_negotiation.cc:408] Connection header received
0504 14:08:30.315088 (+   162us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.315091 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.315142 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.315232 (+    90us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:30.315658 (+   426us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.316152 (+   494us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.317205 (+  1053us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.317345 (+   140us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.317390 (+    45us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.317762 (+   372us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.317848 (+    86us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.317979 (+   131us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.318021 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":148,"thread_start_us":65,"threads_started":1}
I20260504 14:08:30.328646  2908 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.325549 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:35919 (local address 127.25.254.193:33293)
0504 14:08:30.325723 (+   174us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.325727 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.325807 (+    80us) server_negotiation.cc:408] Connection header received
0504 14:08:30.325963 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.325965 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.326018 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.326081 (+    63us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:30.326578 (+   497us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.327121 (+   543us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.328040 (+   919us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.328192 (+   152us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.328229 (+    37us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.328310 (+    81us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.328381 (+    71us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.328495 (+   114us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.328537 (+    42us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":100}
I20260504 14:08:30.328701  2920 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.325407 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.25.254.194:35919)
0504 14:08:30.325721 (+   314us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:30.325736 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:30.325842 (+   106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:30.326113 (+   271us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:30.326117 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:30.326126 (+     9us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:08:30.326438 (+   312us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.326446 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.327242 (+   796us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.327246 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:30.327903 (+   657us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.327912 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.328032 (+   120us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.328070 (+    38us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.328408 (+   338us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.328414 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:08:30.328482 (+    68us) client_negotiation.cc:241] Negotiation successful
0504 14:08:30.328546 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":203,"thread_start_us":104,"threads_started":1}
I20260504 14:08:30.356714  2925 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.353280 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44849 (local address 127.25.254.193:41877)
0504 14:08:30.353584 (+   304us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:30.353600 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:30.353713 (+   113us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:30.354007 (+   294us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:30.354010 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:30.354018 (+     8us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:08:30.354330 (+   312us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.354337 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.355255 (+   918us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.355261 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:30.355904 (+   643us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:30.355914 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.356026 (+   112us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.356071 (+    45us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.356436 (+   365us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.356445 (+     9us) client_negotiation.cc:770] Sending connection context
0504 14:08:30.356506 (+    61us) client_negotiation.cc:241] Negotiation successful
0504 14:08:30.356572 (+    66us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":182,"thread_start_us":103,"threads_started":1}
I20260504 14:08:30.356707  2917 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.353330 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:41877 (local address 127.25.254.194:44849)
0504 14:08:30.353473 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.353478 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.353757 (+   279us) server_negotiation.cc:408] Connection header received
0504 14:08:30.353845 (+    88us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.353848 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.353900 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.354000 (+   100us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:30.354483 (+   483us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.355142 (+   659us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.356026 (+   884us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.356237 (+   211us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.356276 (+    39us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.356341 (+    65us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.356429 (+    88us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.356526 (+    97us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.356569 (+    43us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":54}
I20260504 14:08:30.386739  2335 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:30.386929  2335 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-user', principal='test-user@KRBTEST.COM'} at 127.0.0.1:55228:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:30.389387  2335 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Sending DeleteTablet for 3 replicas of tablet a3b88886f47c4524a0e07728863f3826
I20260504 14:08:30.390079  2618 tablet_service.cc:1558] Processing DeleteTablet for tablet a3b88886f47c4524a0e07728863f3826 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:49956
I20260504 14:08:30.390407  2760 tablet_service.cc:1558] Processing DeleteTablet for tablet a3b88886f47c4524a0e07728863f3826 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:51874
I20260504 14:08:30.390482  2928 tablet_replica.cc:333] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: stopping tablet replica
I20260504 14:08:30.390661  2928 raft_consensus.cc:2243] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:30.390808  2476 tablet_service.cc:1558] Processing DeleteTablet for tablet a3b88886f47c4524a0e07728863f3826 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57610
I20260504 14:08:30.391004  2928 raft_consensus.cc:2272] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.391527  2930 tablet_replica.cc:333] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: stopping tablet replica
I20260504 14:08:30.391644  2930 raft_consensus.cc:2243] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:30.391758  2930 raft_consensus.cc:2272] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.391782  2928 ts_tablet_manager.cc:1916] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.392426  2930 ts_tablet_manager.cc:1916] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.395040  2930 ts_tablet_manager.cc:1929] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:08:30.395108  2930 log.cc:1199] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/a3b88886f47c4524a0e07728863f3826
I20260504 14:08:30.395118  2928 ts_tablet_manager.cc:1929] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:08:30.395193  2928 log.cc:1199] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/a3b88886f47c4524a0e07728863f3826
I20260504 14:08:30.395381  2930 ts_tablet_manager.cc:1950] T a3b88886f47c4524a0e07728863f3826 P f04ea53f457241c9aa321af4ab8e451a: Deleting consensus metadata
I20260504 14:08:30.395483  2928 ts_tablet_manager.cc:1950] T a3b88886f47c4524a0e07728863f3826 P b2991172abc5408aa043125b7f6941f9: Deleting consensus metadata
I20260504 14:08:30.396188  2321 catalog_manager.cc:5002] TS f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293): tablet a3b88886f47c4524a0e07728863f3826 (table test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]) successfully deleted
I20260504 14:08:30.396397  2321 catalog_manager.cc:5002] TS b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849): tablet a3b88886f47c4524a0e07728863f3826 (table test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]) successfully deleted
I20260504 14:08:30.398363  2929 tablet_replica.cc:333] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: stopping tablet replica
I20260504 14:08:30.398558  2929 raft_consensus.cc:2243] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:30.398757  2929 raft_consensus.cc:2272] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.399501  2929 ts_tablet_manager.cc:1916] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.402459  2929 ts_tablet_manager.cc:1929] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:08:30.402595  2929 log.cc:1199] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/a3b88886f47c4524a0e07728863f3826
I20260504 14:08:30.402949  2929 ts_tablet_manager.cc:1950] T a3b88886f47c4524a0e07728863f3826 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting consensus metadata
I20260504 14:08:30.403733  2321 catalog_manager.cc:5002] TS 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789): tablet a3b88886f47c4524a0e07728863f3826 (table test-table [id=bfcfa999dbb5458dacb3ef59fa7605a2]) successfully deleted
May 04 14:08:30 dist-test-slave-2x32 krb5kdc[2289](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903710, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
May 04 14:08:30 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:30 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
W20260504 14:08:30.445067  2938 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:30.425111 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55260 (local address 127.25.254.254:39351)
0504 14:08:30.425379 (+   268us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.425384 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.425488 (+   104us) server_negotiation.cc:408] Connection header received
0504 14:08:30.425660 (+   172us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.425664 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.425728 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.425819 (+    91us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:30.427202 (+  1383us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.428256 (+  1054us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.429225 (+   969us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.429453 (+   228us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.444886 (+ 15433us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:55260: BlockingRecv error: recv got EOF from 127.0.0.1:55260 (error 108)
Metrics: {"server-negotiator.queue_time_us":190,"thread_start_us":83,"threads_started":1}
May 04 14:08:30 dist-test-slave-2x32 krb5kdc[2289](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903710, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM
I20260504 14:08:30.456282  2943 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.444643 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:55268 (local address 127.25.254.254:39351)
0504 14:08:30.445270 (+   627us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.445275 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.445618 (+   343us) server_negotiation.cc:408] Connection header received
0504 14:08:30.445748 (+   130us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.445752 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.445807 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.445882 (+    75us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:30.446995 (+  1113us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.447854 (+   859us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.449093 (+  1239us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.449326 (+   233us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.452452 (+  3126us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:30.452486 (+    34us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:30.452492 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:30.452538 (+    46us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:30.454550 (+  2012us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:30.455075 (+   525us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:30.455079 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:30.455081 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:30.455132 (+    51us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:30.455372 (+   240us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:30.455375 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:30.455377 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:30.455559 (+   182us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:30.455685 (+   126us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.455922 (+   237us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.456069 (+   147us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":522,"thread_start_us":87,"threads_started":1}
I20260504 14:08:30.459501  2335 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:08:30.460019  2335 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:30.465560  2476 tablet_service.cc:1511] Processing CreateTablet for tablet 772a37b8411c4ed399d67b9e24192b11 (DEFAULT_TABLE table=test-table [id=2b3a7f7ef69a490ba3796286325c7d72]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.465937  2476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 772a37b8411c4ed399d67b9e24192b11. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.466404  2618 tablet_service.cc:1511] Processing CreateTablet for tablet 772a37b8411c4ed399d67b9e24192b11 (DEFAULT_TABLE table=test-table [id=2b3a7f7ef69a490ba3796286325c7d72]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.466606  2760 tablet_service.cc:1511] Processing CreateTablet for tablet 772a37b8411c4ed399d67b9e24192b11 (DEFAULT_TABLE table=test-table [id=2b3a7f7ef69a490ba3796286325c7d72]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.466692  2618 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 772a37b8411c4ed399d67b9e24192b11. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.466874  2760 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 772a37b8411c4ed399d67b9e24192b11. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.469386  2945 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Bootstrap starting.
I20260504 14:08:30.469813  2947 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Bootstrap starting.
I20260504 14:08:30.470330  2945 tablet_bootstrap.cc:654] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.470788  2947 tablet_bootstrap.cc:654] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.471514  2945 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: No bootstrap required, opened a new log
I20260504 14:08:30.471592  2945 ts_tablet_manager.cc:1403] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:30.471983  2945 raft_consensus.cc:359] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.472086  2945 raft_consensus.cc:385] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.472113  2945 raft_consensus.cc:740] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.472239  2945 consensus_queue.cc:260] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.472494  2945 ts_tablet_manager.cc:1434] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260504 14:08:30.472821  2946 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Bootstrap starting.
I20260504 14:08:30.473534  2947 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: No bootstrap required, opened a new log
I20260504 14:08:30.473618  2947 ts_tablet_manager.cc:1403] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent bootstrapping tablet: real 0.004s	user 0.002s	sys 0.000s
I20260504 14:08:30.473999  2946 tablet_bootstrap.cc:654] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.474056  2947 raft_consensus.cc:359] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.474269  2947 raft_consensus.cc:385] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.474316  2947 raft_consensus.cc:740] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.474458  2947 consensus_queue.cc:260] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.474721  2947 ts_tablet_manager.cc:1434] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:30.476481  2946 tablet_bootstrap.cc:492] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: No bootstrap required, opened a new log
I20260504 14:08:30.476617  2946 ts_tablet_manager.cc:1403] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Time spent bootstrapping tablet: real 0.004s	user 0.000s	sys 0.003s
I20260504 14:08:30.477159  2946 raft_consensus.cc:359] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.477330  2946 raft_consensus.cc:385] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.477404  2946 raft_consensus.cc:740] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.477571  2946 consensus_queue.cc:260] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.477878  2946 ts_tablet_manager.cc:1434] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20260504 14:08:30.762954  2860 raft_consensus.cc:493] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:30.763118  2860 raft_consensus.cc:515] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.763588  2860 leader_election.cc:290] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:30.764082  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "772a37b8411c4ed399d67b9e24192b11" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" is_pre_election: true
I20260504 14:08:30.764093  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "772a37b8411c4ed399d67b9e24192b11" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9" is_pre_election: true
I20260504 14:08:30.764237  2780 raft_consensus.cc:2468] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:30.764266  2638 raft_consensus.cc:2468] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 0.
I20260504 14:08:30.764581  2427 leader_election.cc:304] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:30.764804  2860 raft_consensus.cc:2804] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:30.764865  2860 raft_consensus.cc:493] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:30.764948  2860 raft_consensus.cc:3060] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.765972  2860 raft_consensus.cc:515] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.766371  2860 leader_election.cc:290] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Requested vote from peers b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849), 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789)
I20260504 14:08:30.767077  2780 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "772a37b8411c4ed399d67b9e24192b11" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc"
I20260504 14:08:30.767077  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "772a37b8411c4ed399d67b9e24192b11" candidate_uuid: "f04ea53f457241c9aa321af4ab8e451a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9"
I20260504 14:08:30.767210  2780 raft_consensus.cc:3060] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.767210  2638 raft_consensus.cc:3060] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:30.768296  2780 raft_consensus.cc:2468] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:30.768296  2638 raft_consensus.cc:2468] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f04ea53f457241c9aa321af4ab8e451a in term 1.
I20260504 14:08:30.768625  2427 leader_election.cc:304] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b2991172abc5408aa043125b7f6941f9, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:30.768806  2860 raft_consensus.cc:2804] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:30.768934  2860 raft_consensus.cc:697] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 1 LEADER]: Becoming Leader. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Running, Role: LEADER
I20260504 14:08:30.769069  2860 consensus_queue.cc:237] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.770963  2335 catalog_manager.cc:5671] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a reported cstate change: term changed from 0 to 1, leader changed from <none> to f04ea53f457241c9aa321af4ab8e451a (127.25.254.193). New cstate: current_term: 1 leader_uuid: "f04ea53f457241c9aa321af4ab8e451a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } health_report { overall_health: UNKNOWN } } }
I20260504 14:08:30.841941  2953 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:30.839080 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:57630 (local address 127.25.254.193:33293)
0504 14:08:30.839437 (+   357us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:30.839440 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:30.839453 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:30.839496 (+    43us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:30.839499 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:30.839545 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:30.839647 (+   102us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:30.840037 (+   390us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.840525 (+   488us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:30.841206 (+   681us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:30.841398 (+   192us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:30.841438 (+    40us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:30.841510 (+    72us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:30.841602 (+    92us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:30.841730 (+   128us) server_negotiation.cc:300] Negotiation successful
0504 14:08:30.841783 (+    53us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":275,"thread_start_us":144,"threads_started":1}
I20260504 14:08:30.844224  2780 raft_consensus.cc:1275] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:30.844224  2638 raft_consensus.cc:1275] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Refusing update from remote peer f04ea53f457241c9aa321af4ab8e451a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:30.844756  2913 consensus_queue.cc:1048] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.844930  2860 consensus_queue.cc:1048] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [LEADER]: Connected to new peer: Peer: permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:30.852768  2335 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:30.852953  2335 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:30.855862  2335 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Sending DeleteTablet for 3 replicas of tablet 772a37b8411c4ed399d67b9e24192b11
I20260504 14:08:30.856503  2476 tablet_service.cc:1558] Processing DeleteTablet for tablet 772a37b8411c4ed399d67b9e24192b11 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57610
I20260504 14:08:30.856544  2618 tablet_service.cc:1558] Processing DeleteTablet for tablet 772a37b8411c4ed399d67b9e24192b11 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:49956
I20260504 14:08:30.856726  2930 tablet_replica.cc:333] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: stopping tablet replica
I20260504 14:08:30.856748  2617 tablet_service.cc:1558] Processing DeleteTablet for tablet 772a37b8411c4ed399d67b9e24192b11 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:49956
I20260504 14:08:30.856866  2930 raft_consensus.cc:2243] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:30.857115  2930 raft_consensus.cc:2272] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.857126  2760 tablet_service.cc:1558] Processing DeleteTablet for tablet 772a37b8411c4ed399d67b9e24192b11 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:30 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:51874
I20260504 14:08:30.857308  2928 tablet_replica.cc:333] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: stopping tablet replica
I20260504 14:08:30.857348  2929 tablet_replica.cc:333] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: stopping tablet replica
I20260504 14:08:30.857430  2928 raft_consensus.cc:2243] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:30.857456  2929 raft_consensus.cc:2243] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:30.857568  2929 raft_consensus.cc:2272] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.857566  2928 raft_consensus.cc:2272] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:30.858423  2930 ts_tablet_manager.cc:1916] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.858423  2928 ts_tablet_manager.cc:1916] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.858632  2929 ts_tablet_manager.cc:1916] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:30.860555  2929 ts_tablet_manager.cc:1929] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:30.860626  2929 log.cc:1199] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/772a37b8411c4ed399d67b9e24192b11
I20260504 14:08:30.860780  2928 ts_tablet_manager.cc:1929] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:30.860780  2930 ts_tablet_manager.cc:1929] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:30.860848  2930 log.cc:1199] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-0/wal/wals/772a37b8411c4ed399d67b9e24192b11
I20260504 14:08:30.860848  2928 log.cc:1199] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/772a37b8411c4ed399d67b9e24192b11
I20260504 14:08:30.860894  2929 ts_tablet_manager.cc:1950] T 772a37b8411c4ed399d67b9e24192b11 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting consensus metadata
I20260504 14:08:30.861112  2930 ts_tablet_manager.cc:1950] T 772a37b8411c4ed399d67b9e24192b11 P f04ea53f457241c9aa321af4ab8e451a: Deleting consensus metadata
I20260504 14:08:30.861121  2928 ts_tablet_manager.cc:1950] T 772a37b8411c4ed399d67b9e24192b11 P b2991172abc5408aa043125b7f6941f9: Deleting consensus metadata
I20260504 14:08:30.861608  2321 catalog_manager.cc:5002] TS 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789): tablet 772a37b8411c4ed399d67b9e24192b11 (table test-table [id=2b3a7f7ef69a490ba3796286325c7d72]) successfully deleted
I20260504 14:08:30.861985  2321 catalog_manager.cc:5002] TS b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849): tablet 772a37b8411c4ed399d67b9e24192b11 (table test-table [id=2b3a7f7ef69a490ba3796286325c7d72]) successfully deleted
I20260504 14:08:30.862146  2321 catalog_manager.cc:5002] TS f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293): tablet 772a37b8411c4ed399d67b9e24192b11 (table test-table [id=2b3a7f7ef69a490ba3796286325c7d72]) successfully deleted
I20260504 14:08:30.862095  2335 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:08:30.862620  2335 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
W20260504 14:08:30.862818  2321 catalog_manager.cc:4969] TS b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849): delete failed for tablet 772a37b8411c4ed399d67b9e24192b11 because the tablet was not found. No further retry: Not found: Tablet not found: 772a37b8411c4ed399d67b9e24192b11
I20260504 14:08:30.867913  2476 tablet_service.cc:1511] Processing CreateTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 (DEFAULT_TABLE table=test-table [id=6942e8f70fc1407280bfeda6eeb1a8b8]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.868255  2476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f2d175e8ed70467ebfb029a51d6d8249. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.868209  2617 tablet_service.cc:1511] Processing CreateTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 (DEFAULT_TABLE table=test-table [id=6942e8f70fc1407280bfeda6eeb1a8b8]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.868464  2617 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f2d175e8ed70467ebfb029a51d6d8249. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.870497  2760 tablet_service.cc:1511] Processing CreateTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 (DEFAULT_TABLE table=test-table [id=6942e8f70fc1407280bfeda6eeb1a8b8]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:30.870703  2945 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: Bootstrap starting.
I20260504 14:08:30.870872  2760 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f2d175e8ed70467ebfb029a51d6d8249. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:30.871500  2945 tablet_bootstrap.cc:654] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.872022  2946 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Bootstrap starting.
I20260504 14:08:30.872745  2947 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Bootstrap starting.
I20260504 14:08:30.872845  2946 tablet_bootstrap.cc:654] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.873744  2945 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: No bootstrap required, opened a new log
I20260504 14:08:30.873786  2947 tablet_bootstrap.cc:654] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:30.873854  2945 ts_tablet_manager.cc:1403] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.000s
I20260504 14:08:30.874040  2946 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: No bootstrap required, opened a new log
I20260504 14:08:30.874142  2946 ts_tablet_manager.cc:1403] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20260504 14:08:30.874394  2945 raft_consensus.cc:359] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.874534  2945 raft_consensus.cc:385] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.874574  2945 raft_consensus.cc:740] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f04ea53f457241c9aa321af4ab8e451a, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.874637  2946 raft_consensus.cc:359] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.874756  2946 raft_consensus.cc:385] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.874735  2945 consensus_queue.cc:260] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.874809  2946 raft_consensus.cc:740] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b2991172abc5408aa043125b7f6941f9, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.874930  2946 consensus_queue.cc:260] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.875223  2946 ts_tablet_manager.cc:1434] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20260504 14:08:30.875310  2947 tablet_bootstrap.cc:492] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: No bootstrap required, opened a new log
I20260504 14:08:30.875397  2947 ts_tablet_manager.cc:1403] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent bootstrapping tablet: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:30.875880  2947 raft_consensus.cc:359] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.876012  2947 raft_consensus.cc:385] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:30.876048  2947 raft_consensus.cc:740] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Initialized, Role: FOLLOWER
I20260504 14:08:30.876163  2947 consensus_queue.cc:260] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:30.876466  2947 ts_tablet_manager.cc:1434] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20260504 14:08:30.875056  2945 ts_tablet_manager.cc:1434] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
I20260504 14:08:31.080248  2861 raft_consensus.cc:493] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:31.080499  2861 raft_consensus.cc:515] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:31.081661  2861 leader_election.cc:290] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293), b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849)
I20260504 14:08:31.085271  2959 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.081982 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.25.254.195:55695)
0504 14:08:31.082330 (+   348us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:31.082347 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:31.082506 (+   159us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:31.082822 (+   316us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:31.082827 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:31.082855 (+    28us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:31.083205 (+   350us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.083213 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.084089 (+   876us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.084093 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:31.084851 (+   758us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.084858 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.084954 (+    96us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.084968 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:31.085019 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:08:31.085060 (+    41us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":262,"spinlock_wait_cycles":48256,"thread_start_us":102,"threads_started":1}
I20260504 14:08:31.085778  2953 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.082110 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:55695 (local address 127.25.254.193:33293)
0504 14:08:31.082307 (+   197us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:31.082312 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:31.082442 (+   130us) server_negotiation.cc:408] Connection header received
0504 14:08:31.082629 (+   187us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:31.082633 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:31.082697 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:31.082786 (+    89us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:31.083334 (+   548us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.083965 (+   631us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.085001 (+  1036us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.085482 (+   481us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.085516 (+    34us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:31.085567 (+    51us) server_negotiation.cc:300] Negotiation successful
0504 14:08:31.085660 (+    93us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":90}
I20260504 14:08:31.085831  2960 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.082454 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:44849 (local address 127.25.254.195:52801)
0504 14:08:31.082815 (+   361us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:31.082828 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:31.082937 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:31.083226 (+   289us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:31.083228 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:31.083242 (+    14us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:31.083526 (+   284us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.083533 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.084445 (+   912us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.084448 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:31.085180 (+   732us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.085190 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.085320 (+   130us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.085570 (+   250us) client_negotiation.cc:770] Sending connection context
0504 14:08:31.085638 (+    68us) client_negotiation.cc:241] Negotiation successful
0504 14:08:31.085698 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":277,"spinlock_wait_cycles":896,"thread_start_us":88,"threads_started":1}
I20260504 14:08:31.086131  2961 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.082529 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:52801 (local address 127.25.254.194:44849)
0504 14:08:31.082831 (+   302us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:31.082837 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:31.082902 (+    65us) server_negotiation.cc:408] Connection header received
0504 14:08:31.083063 (+   161us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:31.083066 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:31.083118 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:31.083212 (+    94us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:31.083667 (+   455us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.084323 (+   656us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.085316 (+   993us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.085787 (+   471us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.085873 (+    86us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:31.085929 (+    56us) server_negotiation.cc:300] Negotiation successful
0504 14:08:31.085983 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":201,"thread_start_us":91,"threads_started":1}
I20260504 14:08:31.086326  2495 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f2d175e8ed70467ebfb029a51d6d8249" candidate_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f04ea53f457241c9aa321af4ab8e451a" is_pre_election: true
I20260504 14:08:31.086495  2495 raft_consensus.cc:2468] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3fc252a5c8fc4f69b1c0a18dde0aaddc in term 0.
I20260504 14:08:31.086575  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f2d175e8ed70467ebfb029a51d6d8249" candidate_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9" is_pre_election: true
I20260504 14:08:31.086753  2638 raft_consensus.cc:2468] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3fc252a5c8fc4f69b1c0a18dde0aaddc in term 0.
I20260504 14:08:31.086902  2711 leader_election.cc:304] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, f04ea53f457241c9aa321af4ab8e451a; no voters: 
I20260504 14:08:31.087139  2861 raft_consensus.cc:2804] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:31.087204  2861 raft_consensus.cc:493] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:31.087244  2861 raft_consensus.cc:3060] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:31.088161  2861 raft_consensus.cc:515] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:31.088523  2861 leader_election.cc:290] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [CANDIDATE]: Term 1 election: Requested vote from peers f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293), b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849)
I20260504 14:08:31.089046  2495 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f2d175e8ed70467ebfb029a51d6d8249" candidate_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f04ea53f457241c9aa321af4ab8e451a"
I20260504 14:08:31.089046  2638 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f2d175e8ed70467ebfb029a51d6d8249" candidate_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b2991172abc5408aa043125b7f6941f9"
I20260504 14:08:31.089174  2638 raft_consensus.cc:3060] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:31.089174  2495 raft_consensus.cc:3060] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:31.090279  2638 raft_consensus.cc:2468] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3fc252a5c8fc4f69b1c0a18dde0aaddc in term 1.
I20260504 14:08:31.090279  2495 raft_consensus.cc:2468] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3fc252a5c8fc4f69b1c0a18dde0aaddc in term 1.
I20260504 14:08:31.090590  2711 leader_election.cc:304] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fc252a5c8fc4f69b1c0a18dde0aaddc, b2991172abc5408aa043125b7f6941f9; no voters: 
I20260504 14:08:31.090822  2861 raft_consensus.cc:2804] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:31.091099  2861 raft_consensus.cc:697] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 LEADER]: Becoming Leader. State: Replica: 3fc252a5c8fc4f69b1c0a18dde0aaddc, State: Running, Role: LEADER
I20260504 14:08:31.091372  2861 consensus_queue.cc:237] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } }
I20260504 14:08:31.093782  2337 catalog_manager.cc:5671] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc reported cstate change: term changed from 0 to 1, leader changed from <none> to 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195). New cstate: current_term: 1 leader_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fc252a5c8fc4f69b1c0a18dde0aaddc" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 44789 } health_report { overall_health: HEALTHY } } }
I20260504 14:08:31.122711  2964 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.119121 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:51888 (local address 127.25.254.195:44789)
0504 14:08:31.119439 (+   318us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:31.119443 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:31.119455 (+    12us) server_negotiation.cc:408] Connection header received
0504 14:08:31.119526 (+    71us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:31.119528 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:31.119586 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:31.119751 (+   165us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:31.120121 (+   370us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.120763 (+   642us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.121400 (+   637us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.121568 (+   168us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.121619 (+    51us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:31.122056 (+   437us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:31.122226 (+   170us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:31.122365 (+   139us) server_negotiation.cc:300] Negotiation successful
0504 14:08:31.122414 (+    49us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":237,"thread_start_us":141,"threads_started":1}
I20260504 14:08:31.132938  2953 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.129930 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:53031 (local address 127.25.254.193:33293)
0504 14:08:31.130088 (+   158us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:31.130092 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:31.130131 (+    39us) server_negotiation.cc:408] Connection header received
0504 14:08:31.130378 (+   247us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:31.130381 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:31.130433 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:31.130504 (+    71us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:31.131028 (+   524us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.131508 (+   480us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.132246 (+   738us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.132408 (+   162us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.132450 (+    42us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:31.132533 (+    83us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:31.132603 (+    70us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:31.132780 (+   177us) server_negotiation.cc:300] Negotiation successful
0504 14:08:31.132828 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":65}
I20260504 14:08:31.132942  2967 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.129675 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.25.254.195:53031)
0504 14:08:31.130064 (+   389us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:31.130077 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:31.130227 (+   150us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:31.130602 (+   375us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:31.130605 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:31.130612 (+     7us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:08:31.130870 (+   258us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.130876 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.131635 (+   759us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.131639 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:31.132110 (+   471us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.132118 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.132215 (+    97us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.132252 (+    37us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:08:31.132646 (+   394us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:08:31.132653 (+     7us) client_negotiation.cc:770] Sending connection context
0504 14:08:31.132730 (+    77us) client_negotiation.cc:241] Negotiation successful
0504 14:08:31.132795 (+    65us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":242,"thread_start_us":128,"threads_started":1}
I20260504 14:08:31.139668  2495 raft_consensus.cc:1275] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Refusing update from remote peer 3fc252a5c8fc4f69b1c0a18dde0aaddc: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:31.139746  2638 raft_consensus.cc:1275] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Refusing update from remote peer 3fc252a5c8fc4f69b1c0a18dde0aaddc: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:31.140268  2861 consensus_queue.cc:1048] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [LEADER]: Connected to new peer: Peer: permanent_uuid: "b2991172abc5408aa043125b7f6941f9" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 44849 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:31.140482  2962 consensus_queue.cc:1048] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [LEADER]: Connected to new peer: Peer: permanent_uuid: "f04ea53f457241c9aa321af4ab8e451a" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 33293 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:31.159533  2972 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.156309 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:44789 (local address 127.25.254.193:37565)
0504 14:08:31.156547 (+   238us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:31.156561 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:31.156652 (+    91us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:31.156925 (+   273us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:31.156928 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:31.156941 (+    13us) client_negotiation.cc:190] Negotiated authn=TOKEN
0504 14:08:31.157205 (+   264us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.157214 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.158082 (+   868us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.158086 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:31.158565 (+   479us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:31.158573 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.158666 (+    93us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.158686 (+    20us) client_negotiation.cc:253] Sending TOKEN_EXCHANGE NegotiatePB request
0504 14:08:31.159231 (+   545us) client_negotiation.cc:272] Received TOKEN_EXCHANGE NegotiatePB response
0504 14:08:31.159237 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:08:31.159312 (+    75us) client_negotiation.cc:241] Negotiation successful
0504 14:08:31.159379 (+    67us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":143,"thread_start_us":63,"threads_started":1}
I20260504 14:08:31.159572  2964 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.156485 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:37565 (local address 127.25.254.195:44789)
0504 14:08:31.156616 (+   131us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:31.156620 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:31.156636 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:31.156778 (+   142us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:31.156780 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:31.156827 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:31.156890 (+    63us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:31.157432 (+   542us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.157971 (+   539us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:31.158742 (+   771us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:31.158953 (+   211us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:31.159028 (+    75us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:31.159113 (+    85us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:31.159200 (+    87us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:31.159362 (+   162us) server_negotiation.cc:300] Negotiation successful
0504 14:08:31.159407 (+    45us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":48}
I20260504 14:08:31.183225  2337 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:31.183403  2337 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:55268:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:31.185619  2337 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 70b2453e2a074974bbdb6e971a688516: Sending DeleteTablet for 3 replicas of tablet f2d175e8ed70467ebfb029a51d6d8249
I20260504 14:08:31.186285  2476 tablet_service.cc:1558] Processing DeleteTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:31 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:57610
I20260504 14:08:31.186415  2617 tablet_service.cc:1558] Processing DeleteTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:31 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:49956
I20260504 14:08:31.186494  2930 tablet_replica.cc:333] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: stopping tablet replica
I20260504 14:08:31.186579  2928 tablet_replica.cc:333] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: stopping tablet replica
I20260504 14:08:31.186631  2930 raft_consensus.cc:2243] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:31.186580  2760 tablet_service.cc:1558] Processing DeleteTablet for tablet f2d175e8ed70467ebfb029a51d6d8249 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:31 UTC) from {username='oryx', principal='oryx/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:51874
I20260504 14:08:31.186758  2930 raft_consensus.cc:2272] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:31.186748  2928 raft_consensus.cc:2243] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:31.186815  2929 tablet_replica.cc:333] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: stopping tablet replica
I20260504 14:08:31.186861  2928 raft_consensus.cc:2272] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:31.186934  2929 raft_consensus.cc:2243] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:31.187237  2929 raft_consensus.cc:2272] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:31.187565  2930 ts_tablet_manager.cc:1916] T f2d175e8ed70467ebfb029a51d6d8249 P f04ea53f457241c9aa321af4ab8e451a: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:31.187565  2928 ts_tablet_manager.cc:1916] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:31.188241  2929 ts_tablet_manager.cc:1916] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:31.189970  2928 ts_tablet_manager.cc:1929] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:08:31.190047  2928 log.cc:1199] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/f2d175e8ed70467ebfb029a51d6d8249
I20260504 14:08:31.190358  2928 ts_tablet_manager.cc:1950] T f2d175e8ed70467ebfb029a51d6d8249 P b2991172abc5408aa043125b7f6941f9: Deleting consensus metadata
I20260504 14:08:31.190616  2929 ts_tablet_manager.cc:1929] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.5
I20260504 14:08:31.190686  2929 log.cc:1199] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonDefaultPrincipal.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/f2d175e8ed70467ebfb029a51d6d8249
I20260504 14:08:31.190982 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 2411
I20260504 14:08:31.191011  2929 ts_tablet_manager.cc:1950] T f2d175e8ed70467ebfb029a51d6d8249 P 3fc252a5c8fc4f69b1c0a18dde0aaddc: Deleting consensus metadata
I20260504 14:08:31.192606  2321 catalog_manager.cc:5002] TS b2991172abc5408aa043125b7f6941f9 (127.25.254.194:44849): tablet f2d175e8ed70467ebfb029a51d6d8249 (table test-table [id=6942e8f70fc1407280bfeda6eeb1a8b8]) successfully deleted
I20260504 14:08:31.193377  2321 catalog_manager.cc:5002] TS 3fc252a5c8fc4f69b1c0a18dde0aaddc (127.25.254.195:44789): tablet f2d175e8ed70467ebfb029a51d6d8249 (table test-table [id=6942e8f70fc1407280bfeda6eeb1a8b8]) successfully deleted
W20260504 14:08:31.201902  2321 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:33293 (error 108)
I20260504 14:08:31.202247 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 2553
I20260504 14:08:31.204113  2976 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:31.203642 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:33293 (local address 127.0.0.1:57640)
0504 14:08:31.203895 (+   253us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:31.203973 (+    78us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:33293: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":160,"thread_start_us":93,"threads_started":1}
W20260504 14:08:31.204478  2321 catalog_manager.cc:4729] TS f04ea53f457241c9aa321af4ab8e451a (127.25.254.193:33293): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet f2d175e8ed70467ebfb029a51d6d8249: Network error: Client connection negotiation failed: client connection to 127.25.254.193:33293: connect: Connection refused (error 111)
I20260504 14:08:31.211966 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 2695
I20260504 14:08:31.220335 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 2305
2026-05-04T14:08:31Z chronyd exiting
[       OK ] SecurityITest.TestNonDefaultPrincipal (4899 ms)
[ RUN      ] SecurityITest.TestNonExistentPrincipal
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:31 dist-test-slave-2x32 krb5kdc[2979](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:31 dist-test-slave-2x32 krb5kdc[2979](info): set up 2 sockets
May 04 14:08:31 dist-test-slave-2x32 krb5kdc[2979](info): commencing operation
krb5kdc: starting...
W20260504 14:08:33.255182 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.011s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:33 dist-test-slave-2x32 krb5kdc[2979](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903713, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:33Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:33Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:33.406841 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42373
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:35933
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:42373
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--principal=oryx with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:33.511473  2995 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:33.511714  2995 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:33.511771  2995 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:33.515292  2995 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:33.515375  2995 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:33.515399  2995 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:33.515419  2995 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:33.515436  2995 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:33.519953  2995 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:35933
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:42373
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=oryx
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:42373
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.2995
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestNonExistentPrincipal.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:33.521169  2995 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:33.522063  2995 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:33.527936  3001 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:33.528077  3003 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:33.527940  3000 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:33.528000  2995 server_base.cc:1061] running on GCE node
I20260504 14:08:33.528586  2995 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:33.529467  2995 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:33.530637  2995 hybrid_clock.cc:648] HybridClock initialized: now 1777903713530608 us; error 42 us; skew 500 ppm
W20260504 14:08:33.533723  2995 builtin_ntp.cc:688] could not shutdown socket: Network error: shutdown error: Transport endpoint is not connected (error 107)
Runtime error: kinit failed: unable to login from keytab: Keytab contains no suitable keys for oryx@KRBTEST.COM
2026-05-04T14:08:33Z chronyd exiting
[       OK ] SecurityITest.TestNonExistentPrincipal (2312 ms)
[ RUN      ] SecurityITest.TestMismatchingPrincipals
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:33 dist-test-slave-2x32 krb5kdc[3008](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:33 dist-test-slave-2x32 krb5kdc[3008](info): set up 2 sockets
May 04 14:08:33 dist-test-slave-2x32 krb5kdc[3008](info): commencing operation
krb5kdc: starting...
W20260504 14:08:35.583114 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.026s	user 0.003s	sys 0.003s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:35 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903715, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:35Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:35Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:35.742144 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35629
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:35629
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:35.847434  3024 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:35.847726  3024 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:35.847841  3024 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:35.851367  3024 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:35.851464  3024 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:35.851507  3024 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:35.851570  3024 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:35.851611  3024 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:35.856349  3024 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:35629
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35629
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3024
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:35.857683  3024 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:35.858616  3024 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:35.864532  3032 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:35.864518  3029 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:35.864518  3030 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:35.864606  3024 server_base.cc:1061] running on GCE node
I20260504 14:08:35.865315  3024 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:35.866292  3024 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:35.867460  3024 hybrid_clock.cc:648] HybridClock initialized: now 1777903715867447 us; error 28 us; skew 500 ppm
May 04 14:08:35 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903715, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:35.870144  3024 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:08:35.871305  3024 webserver.cc:492] Webserver started at http://127.25.254.254:33027/ using document root <none> and password file <none>
I20260504 14:08:35.871951  3024 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:35.872006  3024 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:35.872236  3024 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:35.874014  3024 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "9f298813135f4a2688894275d165d989"
format_stamp: "Formatted at 2026-05-04 14:08:35 on dist-test-slave-2x32"
server_key: "9a8f0760b5aa7f28a1cdb90cc648e356"
server_key_iv: "ece438222775b74fb1052ff00b47e2f0"
server_key_version: "encryptionkey@0"
I20260504 14:08:35.874599  3024 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "9f298813135f4a2688894275d165d989"
format_stamp: "Formatted at 2026-05-04 14:08:35 on dist-test-slave-2x32"
server_key: "9a8f0760b5aa7f28a1cdb90cc648e356"
server_key_iv: "ece438222775b74fb1052ff00b47e2f0"
server_key_version: "encryptionkey@0"
I20260504 14:08:35.878319  3024 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:35.881023  3039 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:35.882428  3024 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:35.882580  3024 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "9f298813135f4a2688894275d165d989"
format_stamp: "Formatted at 2026-05-04 14:08:35 on dist-test-slave-2x32"
server_key: "9a8f0760b5aa7f28a1cdb90cc648e356"
server_key_iv: "ece438222775b74fb1052ff00b47e2f0"
server_key_version: "encryptionkey@0"
I20260504 14:08:35.882711  3024 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:35.908982  3024 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:35.915817  3024 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:35.916152  3024 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:35.924425  3091 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:35629 every 8 connection(s)
I20260504 14:08:35.924427  3024 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:35629
I20260504 14:08:35.925606  3024 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:35.928634  3092 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:35.928944 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3024
I20260504 14:08:35.929183 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:35.929500 26619 external_mini_cluster.cc:1468] Setting key b0a52d4a9f8055028be79326ec62c97c
I20260504 14:08:35.935575  3092 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Bootstrap starting.
May 04 14:08:35 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903715, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:35.938360  3092 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:35.939242  3092 log.cc:826] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:35.941416  3092 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: No bootstrap required, opened a new log
I20260504 14:08:35.943848  3095 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:35.931043 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42216 (local address 127.25.254.254:35629)
0504 14:08:35.931482 (+   439us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:35.931491 (+     9us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:35.931524 (+    33us) server_negotiation.cc:408] Connection header received
0504 14:08:35.932179 (+   655us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:35.932199 (+    20us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:35.932474 (+   275us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:35.932753 (+   279us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:35.933880 (+  1127us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:35.934707 (+   827us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:35.935554 (+   847us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:35.935838 (+   284us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:35.938562 (+  2724us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:35.938580 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:35.938592 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:35.938623 (+    31us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:35.940676 (+  2053us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:35.941342 (+   666us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:35.941347 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:35.941352 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:35.941428 (+    76us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:35.941899 (+   471us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:35.941903 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:35.941905 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:35.942355 (+   450us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:35.942560 (+   205us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:35.942980 (+   420us) server_negotiation.cc:300] Negotiation successful
0504 14:08:35.943233 (+   253us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":276,"thread_start_us":156,"threads_started":1}
I20260504 14:08:35.944224  3092 raft_consensus.cc:359] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:35.944448  3092 raft_consensus.cc:385] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:35.944530  3092 raft_consensus.cc:740] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9f298813135f4a2688894275d165d989, State: Initialized, Role: FOLLOWER
I20260504 14:08:35.945024  3092 consensus_queue.cc:260] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:35.945184  3092 raft_consensus.cc:399] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:35.945256  3092 raft_consensus.cc:493] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:35.945371  3092 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:35.946453  3092 raft_consensus.cc:515] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:35.946815  3092 leader_election.cc:304] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 9f298813135f4a2688894275d165d989; no voters: 
I20260504 14:08:35.947113  3092 leader_election.cc:290] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:35.947278  3097 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:35.947540  3097 raft_consensus.cc:697] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 LEADER]: Becoming Leader. State: Replica: 9f298813135f4a2688894275d165d989, State: Running, Role: LEADER
I20260504 14:08:35.947893  3097 consensus_queue.cc:237] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:35.948251  3092 sys_catalog.cc:565] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:35.949465  3099 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 9f298813135f4a2688894275d165d989. Latest consensus state: current_term: 1 leader_uuid: "9f298813135f4a2688894275d165d989" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } } }
I20260504 14:08:35.949600  3099 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:35.949988  3106 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:35.949988  3098 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "9f298813135f4a2688894275d165d989" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } } }
I20260504 14:08:35.950199  3098 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:35.952728  3106 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:35.958056  3106 catalog_manager.cc:1357] Generated new cluster ID: 82c47af7a952427287680c0c13d8c055
I20260504 14:08:35.958139  3106 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:35.966287  3106 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:35.967206  3106 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:35.974746  3106 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Generated new TSK 0
I20260504 14:08:35.975391  3106 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:36.039412 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35629
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:36.142788  3120 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:36.143009  3120 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:36.143067  3120 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:36.146461  3120 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:36.146533  3120 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:36.146613  3120 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:36.151081  3120 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3120
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:36.152173  3120 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:36.152958  3120 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:36.159479  3125 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.159512  3126 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.159474  3128 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:36.159940  3120 server_base.cc:1061] running on GCE node
I20260504 14:08:36.160320  3120 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:36.160924  3120 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:36.162132  3120 hybrid_clock.cc:648] HybridClock initialized: now 1777903716162111 us; error 40 us; skew 500 ppm
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:36.165213  3120 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:08:36.166369  3120 webserver.cc:492] Webserver started at http://127.25.254.193:38599/ using document root <none> and password file <none>
I20260504 14:08:36.166953  3120 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:36.167007  3120 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:36.167217  3120 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:36.168987  3120 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "e4a3b4212eeb488e90efe81a7c03c2d9"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "2784d45425de1193a480887460940b5a"
server_key_iv: "5c2d25463974902439480af0d7ee35bf"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.169467  3120 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "e4a3b4212eeb488e90efe81a7c03c2d9"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "2784d45425de1193a480887460940b5a"
server_key_iv: "5c2d25463974902439480af0d7ee35bf"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.172900  3120 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.004s
I20260504 14:08:36.175176  3135 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.176215  3120 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:36.176309  3120 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "e4a3b4212eeb488e90efe81a7c03c2d9"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "2784d45425de1193a480887460940b5a"
server_key_iv: "5c2d25463974902439480af0d7ee35bf"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.176390  3120 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:36.211766  3120 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:36.215067  3120 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:36.215297  3120 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:36.215853  3120 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:36.216756  3120 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:36.216806  3120 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.216845  3120 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:36.216861  3120 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.226569  3120 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:41285
I20260504 14:08:36.226594  3248 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:41285 every 8 connection(s)
I20260504 14:08:36.227614  3120 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:36.235698 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3120
I20260504 14:08:36.235819 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:36.236106 26619 external_mini_cluster.cc:1468] Setting key 0daefe7e0ff43bb98eaaa25e4abe2170
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:36.241094  3095 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.229616 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:34175 (local address 127.25.254.254:35629)
0504 14:08:36.229896 (+   280us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:36.229901 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:36.230540 (+   639us) server_negotiation.cc:408] Connection header received
0504 14:08:36.231474 (+   934us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:36.231479 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:36.231544 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:36.231655 (+   111us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:36.233350 (+  1695us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.233865 (+   515us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.234553 (+   688us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.234746 (+   193us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.237262 (+  2516us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:36.237290 (+    28us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:36.237292 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:36.237320 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:36.238938 (+  1618us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.239505 (+   567us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.239509 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.239511 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.239565 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.239895 (+   330us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.239899 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.239901 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.240082 (+   181us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:36.240197 (+   115us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:36.240847 (+   650us) server_negotiation.cc:300] Negotiation successful
0504 14:08:36.240960 (+   113us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":143}
I20260504 14:08:36.242019  3251 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.229882 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:34175)
0504 14:08:36.230392 (+   510us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:36.230428 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:36.231254 (+   826us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:36.231841 (+   587us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:36.231851 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:36.232260 (+   409us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:36.233161 (+   901us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.233177 (+    16us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.234003 (+   826us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.234007 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:36.234422 (+   415us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.234429 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.234658 (+   229us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.235320 (+   662us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:36.235340 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:36.237093 (+  1753us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:36.239102 (+  2009us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.239109 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.239128 (+    19us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.239389 (+   261us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.239681 (+   292us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.239684 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.239685 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.239794 (+   109us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.240209 (+   415us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:36.240218 (+     9us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:36.240512 (+   294us) client_negotiation.cc:770] Sending connection context
0504 14:08:36.240772 (+   260us) client_negotiation.cc:241] Negotiation successful
0504 14:08:36.241042 (+   270us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":295,"thread_start_us":120,"threads_started":1}
I20260504 14:08:36.243284  3249 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35629
I20260504 14:08:36.243585  3249 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:36.244266  3249 heartbeater.cc:507] Master 127.25.254.254:35629 requested a full tablet report, sending...
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:08:36.246037  3056 ts_manager.cc:194] Registered new tserver with Master: e4a3b4212eeb488e90efe81a7c03c2d9 (127.25.254.193:41285)
I20260504 14:08:36.247262  3056 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:34175
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:36.294251 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35629
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:36.400753  3256 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:36.400983  3256 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:36.401041  3256 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:36.404402  3256 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:36.404472  3256 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:36.404619  3256 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:36.408988  3256 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3256
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:36.410130  3256 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:36.411011  3256 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:36.417542  3262 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.417551  3261 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:36.417614  3256 server_base.cc:1061] running on GCE node
W20260504 14:08:36.417529  3264 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:36.418234  3256 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:36.418830  3256 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:36.420037  3256 hybrid_clock.cc:648] HybridClock initialized: now 1777903716420002 us; error 50 us; skew 500 ppm
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:36.423359  3256 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:08:36.424597  3256 webserver.cc:492] Webserver started at http://127.25.254.194:37179/ using document root <none> and password file <none>
I20260504 14:08:36.425190  3256 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:36.425269  3256 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:36.425482  3256 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:36.427317  3256 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "84b501acecd04fcc913513d254756db8"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "6e75143df480ff7dab1a4b931d74ccb5"
server_key_iv: "b759f6a2e1dc835597771cbe5c79ed44"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.427836  3256 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "84b501acecd04fcc913513d254756db8"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "6e75143df480ff7dab1a4b931d74ccb5"
server_key_iv: "b759f6a2e1dc835597771cbe5c79ed44"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.431437  3256 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:36.433776  3271 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.434829  3256 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:36.434973  3256 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "84b501acecd04fcc913513d254756db8"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "6e75143df480ff7dab1a4b931d74ccb5"
server_key_iv: "b759f6a2e1dc835597771cbe5c79ed44"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.435084  3256 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:36.454319  3256 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:36.457233  3256 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:36.457453  3256 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:36.458050  3256 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:36.458989  3256 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:36.459065  3256 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.459136  3256 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:36.459192  3256 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.469682  3256 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:42815
I20260504 14:08:36.469700  3384 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:42815 every 8 connection(s)
I20260504 14:08:36.470708  3256 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:36.480783 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3256
I20260504 14:08:36.480908 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:36.481196 26619 external_mini_cluster.cc:1468] Setting key 445f3e17deaad557813061b9375ee69f
I20260504 14:08:36.483299  3095 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.472434 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:47273 (local address 127.25.254.254:35629)
0504 14:08:36.472587 (+   153us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:36.472591 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:36.473183 (+   592us) server_negotiation.cc:408] Connection header received
0504 14:08:36.473994 (+   811us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:36.473998 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:36.474069 (+    71us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:36.474219 (+   150us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:36.475873 (+  1654us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.476379 (+   506us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.477054 (+   675us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.477190 (+   136us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.479415 (+  2225us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:36.479435 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:36.479437 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:36.479463 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:36.481017 (+  1554us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.481560 (+   543us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.481564 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.481566 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.481620 (+    54us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.482037 (+   417us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.482039 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.482041 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.482250 (+   209us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:36.482333 (+    83us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:36.482949 (+   616us) server_negotiation.cc:300] Negotiation successful
0504 14:08:36.483081 (+   132us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":57}
I20260504 14:08:36.483790  3387 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.472634 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:47273)
0504 14:08:36.473036 (+   402us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:36.473067 (+    31us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:36.473790 (+   723us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:36.474362 (+   572us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:36.474370 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:36.474824 (+   454us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:36.475729 (+   905us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.475743 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.476491 (+   748us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.476494 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:36.476946 (+   452us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.476953 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.477109 (+   156us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.477728 (+   619us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:36.477748 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:36.479242 (+  1494us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:36.481142 (+  1900us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.481148 (+     6us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.481167 (+    19us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.481454 (+   287us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.481770 (+   316us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.481773 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.481775 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.481932 (+   157us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.482360 (+   428us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:36.482366 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:36.482676 (+   310us) client_negotiation.cc:770] Sending connection context
0504 14:08:36.482879 (+   203us) client_negotiation.cc:241] Negotiation successful
0504 14:08:36.483077 (+   198us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":244,"thread_start_us":115,"threads_started":1}
I20260504 14:08:36.484987  3385 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35629
I20260504 14:08:36.485219  3385 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:36.485671  3385 heartbeater.cc:507] Master 127.25.254.254:35629 requested a full tablet report, sending...
I20260504 14:08:36.486714  3056 ts_manager.cc:194] Registered new tserver with Master: 84b501acecd04fcc913513d254756db8 (127.25.254.194:42815)
I20260504 14:08:36.487284  3056 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:47273
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:36.533680 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:35629
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:36.638514  3392 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:36.638742  3392 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:36.638844  3392 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:36.642513  3392 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:36.642585  3392 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:36.642709  3392 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:36.647240  3392 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3392
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:36.648396  3392 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:36.649353  3392 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260504 14:08:36.656431  3392 server_base.cc:1061] running on GCE node
W20260504 14:08:36.656397  3400 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.656421  3398 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.656621  3397 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:36.657164  3392 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:36.657750  3392 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:36.658934  3392 hybrid_clock.cc:648] HybridClock initialized: now 1777903716658906 us; error 43 us; skew 500 ppm
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:36.661530  3392 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:08:36.662803  3392 webserver.cc:492] Webserver started at http://127.25.254.195:33219/ using document root <none> and password file <none>
I20260504 14:08:36.663336  3392 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:36.663383  3392 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:36.663542  3392 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:36.665310  3392 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "b0a7802e080e42fab5f8fad9cf8ef396"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "d39fbb1923d4dec1f291047a3621138b"
server_key_iv: "33e1b67365d8481503557544e694172b"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.665756  3392 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "b0a7802e080e42fab5f8fad9cf8ef396"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "d39fbb1923d4dec1f291047a3621138b"
server_key_iv: "33e1b67365d8481503557544e694172b"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.669252  3392 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.003s
I20260504 14:08:36.671507  3407 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.672803  3392 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:36.672931  3392 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "b0a7802e080e42fab5f8fad9cf8ef396"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "d39fbb1923d4dec1f291047a3621138b"
server_key_iv: "33e1b67365d8481503557544e694172b"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.673020  3392 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:36.694262  3392 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:36.697415  3392 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:36.697584  3392 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:36.698122  3392 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:36.699106  3392 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:36.699152  3392 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.699190  3392 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:36.699205  3392 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.709275  3392 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37221
I20260504 14:08:36.709311  3520 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37221 every 8 connection(s)
I20260504 14:08:36.710289  3392 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:36.719879 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3392
I20260504 14:08:36.720039 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:36.720336 26619 external_mini_cluster.cc:1468] Setting key f9b5913309fef4ebd8bb2e501c0b39a1
I20260504 14:08:36.724601  3095 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.712132 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:38747 (local address 127.25.254.254:35629)
0504 14:08:36.712287 (+   155us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:36.712291 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:36.712946 (+   655us) server_negotiation.cc:408] Connection header received
0504 14:08:36.713861 (+   915us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:36.713865 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:36.713917 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:36.714000 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:36.715756 (+  1756us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.716318 (+   562us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.717088 (+   770us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.717252 (+   164us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.719617 (+  2365us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:36.719698 (+    81us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:36.719701 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:36.719733 (+    32us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:36.721900 (+  2167us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.722793 (+   893us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.722800 (+     7us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.722805 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.722878 (+    73us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:36.723399 (+   521us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:36.723405 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:36.723409 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:36.723585 (+   176us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:36.723683 (+    98us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:36.724343 (+   660us) server_negotiation.cc:300] Negotiation successful
0504 14:08:36.724460 (+   117us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:08:36.725329  3523 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:36.712342 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:38747)
0504 14:08:36.712792 (+   450us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:36.712828 (+    36us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:36.713638 (+   810us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:36.714191 (+   553us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:36.714200 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:36.714590 (+   390us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:36.715507 (+   917us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.715520 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.716482 (+   962us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:36.716485 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:36.716949 (+   464us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:36.716957 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:36.717146 (+   189us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:36.717792 (+   646us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:36.717812 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:36.719432 (+  1620us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:36.722102 (+  2670us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.722109 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.722125 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.722521 (+   396us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.723010 (+   489us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:36.723013 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:36.723015 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:36.723236 (+   221us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:36.723768 (+   532us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:36.723774 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:36.724052 (+   278us) client_negotiation.cc:770] Sending connection context
0504 14:08:36.724247 (+   195us) client_negotiation.cc:241] Negotiation successful
0504 14:08:36.724467 (+   220us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":267,"thread_start_us":105,"threads_started":1}
I20260504 14:08:36.726732  3521 heartbeater.cc:344] Connected to a master server at 127.25.254.254:35629
I20260504 14:08:36.727018  3521 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:36.727577  3521 heartbeater.cc:507] Master 127.25.254.254:35629 requested a full tablet report, sending...
I20260504 14:08:36.728955  3056 ts_manager.cc:194] Registered new tserver with Master: b0a7802e080e42fab5f8fad9cf8ef396 (127.25.254.195:37221)
I20260504 14:08:36.729507  3056 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:38747
I20260504 14:08:36.735819 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
WARNING: no policy specified for oryx/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.193.keytab.
Entry for principal oryx/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.193.keytab.
WARNING: no policy specified for oryx/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.194.keytab.
Entry for principal oryx/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.194.keytab.
WARNING: no policy specified for oryx/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "oryx/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.195.keytab.
Entry for principal oryx/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.195.keytab.
I20260504 14:08:36.817633 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3120
I20260504 14:08:36.823797 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3256
I20260504 14:08:36.829771 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3392
I20260504 14:08:36.835840 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3024
I20260504 14:08:36.842778 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35629
--webserver_interface=127.25.254.254
--webserver_port=33027
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:35629
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:36.947389  3530 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:36.947674  3530 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:36.947789  3530 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:36.951293  3530 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:36.951370  3530 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:36.951395  3530 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:36.951413  3530 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:36.951470  3530 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:36.956200  3530 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:35629
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:35629
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=33027
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3530
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:36.957372  3530 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:36.958329  3530 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260504 14:08:36.964200  3530 server_base.cc:1061] running on GCE node
W20260504 14:08:36.964097  3538 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.964099  3535 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:36.964097  3536 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:36.965054  3530 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:36.966101  3530 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:36.967339  3530 hybrid_clock.cc:648] HybridClock initialized: now 1777903716967308 us; error 48 us; skew 500 ppm
May 04 14:08:36 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903716, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:36.970460  3530 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:08:36.971701  3530 webserver.cc:492] Webserver started at http://127.25.254.254:33027/ using document root <none> and password file <none>
I20260504 14:08:36.972316  3530 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:36.972414  3530 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:36.976248  3530 fs_manager.cc:714] Time spent opening directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:36.978376  3545 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:36.979444  3530 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:36.979544  3530 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "9f298813135f4a2688894275d165d989"
format_stamp: "Formatted at 2026-05-04 14:08:35 on dist-test-slave-2x32"
server_key: "9a8f0760b5aa7f28a1cdb90cc648e356"
server_key_iv: "ece438222775b74fb1052ff00b47e2f0"
server_key_version: "encryptionkey@0"
I20260504 14:08:36.979979  3530 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:37.005146  3530 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:37.008083  3530 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:37.008246  3530 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:37.016266  3530 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:35629
I20260504 14:08:37.016283  3597 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:35629 every 8 connection(s)
I20260504 14:08:37.017342  3530 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:37.019016 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3530
I20260504 14:08:37.019533 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:41285
--local_ip_for_outbound_sockets=127.25.254.193
--tserver_master_addrs=127.25.254.254:35629
--webserver_port=38599
--webserver_interface=127.25.254.193
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--principal=oryx/127.25.254.193
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.193.keytab with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
I20260504 14:08:37.022614  3598 sys_catalog.cc:263] Verifying existing consensus state
I20260504 14:08:37.024155  3598 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Bootstrap starting.
I20260504 14:08:37.040630  3598 log.cc:826] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:37.046924  3598 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Bootstrap replayed 1/1 log segments. Stats: ops{read=4 overwritten=0 applied=4 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260504 14:08:37.047318  3598 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Bootstrap complete.
I20260504 14:08:37.050793  3598 raft_consensus.cc:359] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:37.051092  3598 raft_consensus.cc:740] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9f298813135f4a2688894275d165d989, State: Initialized, Role: FOLLOWER
I20260504 14:08:37.051533  3598 consensus_queue.cc:260] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:37.051645  3598 raft_consensus.cc:399] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:37.051730  3598 raft_consensus.cc:493] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:37.051819  3598 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 1 FOLLOWER]: Advancing to term 2
I20260504 14:08:37.053369  3598 raft_consensus.cc:515] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:37.053689  3598 leader_election.cc:304] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 9f298813135f4a2688894275d165d989; no voters: 
I20260504 14:08:37.054016  3598 leader_election.cc:290] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [CANDIDATE]: Term 2 election: Requested vote from peers 
I20260504 14:08:37.054093  3602 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 2 FOLLOWER]: Leader election won for term 2
I20260504 14:08:37.054363  3602 raft_consensus.cc:697] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [term 2 LEADER]: Becoming Leader. State: Replica: 9f298813135f4a2688894275d165d989, State: Running, Role: LEADER
I20260504 14:08:37.054694  3602 consensus_queue.cc:237] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } }
I20260504 14:08:37.055145  3598 sys_catalog.cc:565] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:37.055994  3603 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "9f298813135f4a2688894275d165d989" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } } }
I20260504 14:08:37.056119  3603 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:37.056413  3604 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 9f298813135f4a2688894275d165d989. Latest consensus state: current_term: 2 leader_uuid: "9f298813135f4a2688894275d165d989" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9f298813135f4a2688894275d165d989" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 35629 } } }
I20260504 14:08:37.056481  3604 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:37.056891  3611 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:37.060045  3611 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:37.060987  3611 catalog_manager.cc:1269] Loaded cluster ID: 82c47af7a952427287680c0c13d8c055
I20260504 14:08:37.061095  3611 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:37.063788  3611 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:37.064481  3611 catalog_manager.cc:6055] T 00000000000000000000000000000000 P 9f298813135f4a2688894275d165d989: Loaded TSK: 0
I20260504 14:08:37.065562  3611 catalog_manager.cc:1524] Initializing in-progress tserver states...
W20260504 14:08:37.128042  3600 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:37.128321  3600 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:37.128412  3600 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:37.132019  3600 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:37.132144  3600 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:37.132279  3600 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:37.136832  3600 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.193.keytab
--principal=oryx/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:41285
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=38599
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3600
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:37.138031  3600 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:37.138924  3600 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:37.145236  3624 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:37.145292  3600 server_base.cc:1061] running on GCE node
W20260504 14:08:37.145236  3627 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:37.145471  3625 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:37.145819  3600 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:37.146461  3600 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:37.147657  3600 hybrid_clock.cc:648] HybridClock initialized: now 1777903717147628 us; error 59 us; skew 500 ppm
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903717, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:37.150414  3600 init.cc:377] Logged in from keytab as oryx/127.25.254.193@KRBTEST.COM (short username oryx)
I20260504 14:08:37.152055  3600 webserver.cc:492] Webserver started at http://127.25.254.193:38599/ using document root <none> and password file <none>
I20260504 14:08:37.152696  3600 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:37.152768  3600 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:37.156832  3600 fs_manager.cc:714] Time spent opening directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:37.158998  3634 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.160219  3600 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:37.160380  3600 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "e4a3b4212eeb488e90efe81a7c03c2d9"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "2784d45425de1193a480887460940b5a"
server_key_iv: "5c2d25463974902439480af0d7ee35bf"
server_key_version: "encryptionkey@0"
I20260504 14:08:37.160884  3600 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:37.188207  3600 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:37.191210  3600 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:37.191432  3600 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:37.192042  3600 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:37.193102  3600 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:37.193173  3600 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.193243  3600 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:37.193274  3600 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.204852  3600 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:41285
I20260504 14:08:37.204869  3747 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:41285 every 8 connection(s)
I20260504 14:08:37.205928  3600 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.216243 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3600
I20260504 14:08:37.216852 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:42815
--local_ip_for_outbound_sockets=127.25.254.194
--tserver_master_addrs=127.25.254.254:35629
--webserver_port=37179
--webserver_interface=127.25.254.194
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--principal=oryx/127.25.254.194
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.194.keytab with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.218833  3751 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.208309 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:59447)
0504 14:08:37.208879 (+   570us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.208922 (+    43us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.209709 (+   787us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.210869 (+  1160us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.210877 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.211241 (+   364us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.212117 (+   876us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.212132 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.213750 (+  1618us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.213754 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.214323 (+   569us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.214332 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.214517 (+   185us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.215174 (+   657us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.215195 (+    21us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.217967 (+  2772us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":386,"thread_start_us":140,"threads_started":1}
W20260504 14:08:37.219039  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.208297 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:59447 (local address 127.25.254.254:35629)
0504 14:08:37.208773 (+   476us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.208785 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.209047 (+   262us) server_negotiation.cc:408] Connection header received
0504 14:08:37.210111 (+  1064us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.210125 (+    14us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.210472 (+   347us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.210785 (+   313us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.212304 (+  1519us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.213579 (+  1275us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.214449 (+   870us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.214718 (+   269us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.218299 (+  3581us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:59447: BlockingRecv error: recv got EOF from 127.25.254.193:59447 (error 108)
Metrics: {"server-negotiator.queue_time_us":313,"thread_start_us":131,"threads_started":1}
W20260504 14:08:37.219476  3748 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:35629 (0 consecutive failures): Not authorized: Failed to ping master at 127.25.254.254:35629: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.226286  3751 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.220057 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:34075)
0504 14:08:37.220239 (+   182us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.220254 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.220337 (+    83us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.220656 (+   319us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.220660 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.220854 (+   194us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.221115 (+   261us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.221122 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.222371 (+  1249us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.222376 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.222976 (+   600us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.222985 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.223099 (+   114us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.223680 (+   581us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.223704 (+    24us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.226089 (+  2385us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":91}
W20260504 14:08:37.226446  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.220210 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:34075 (local address 127.25.254.254:35629)
0504 14:08:37.220368 (+   158us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.220374 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.220394 (+    20us) server_negotiation.cc:408] Connection header received
0504 14:08:37.220439 (+    45us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.220445 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.220504 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.220611 (+   107us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.221261 (+   650us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.222239 (+   978us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.223114 (+   875us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.223757 (+   643us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.226283 (+  2526us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:34075: BlockingRecv error: recv got EOF from 127.25.254.193:34075 (error 108)
Metrics: {"server-negotiator.queue_time_us":49}
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.232651  3751 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.227213 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:37401)
0504 14:08:37.227408 (+   195us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.227422 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.227500 (+    78us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.227810 (+   310us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.227813 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.227996 (+   183us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.228208 (+   212us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.228214 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.229244 (+  1030us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.229248 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.229807 (+   559us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.229814 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.229928 (+   114us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.230475 (+   547us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.230492 (+    17us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.232520 (+  2028us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":111}
W20260504 14:08:37.232795  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.227279 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:37401 (local address 127.25.254.254:35629)
0504 14:08:37.227441 (+   162us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.227447 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.227465 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:08:37.227600 (+   135us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.227606 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.227686 (+    80us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.227776 (+    90us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.228337 (+   561us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.229120 (+   783us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.229935 (+   815us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.230561 (+   626us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.232679 (+  2118us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:37401: BlockingRecv error: recv got EOF from 127.25.254.193:37401 (error 108)
Metrics: {"server-negotiator.queue_time_us":60}
W20260504 14:08:37.233105  3748 heartbeater.cc:412] Failed 3 heartbeats in a row: no longer allowing fast heartbeat attempts.
W20260504 14:08:37.333420  3752 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:37.333768  3752 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:37.333868  3752 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:37.337598  3752 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:37.337671  3752 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:37.337756  3752 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:37.342334  3752 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.194.keytab
--principal=oryx/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:42815
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=37179
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3752
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:37.343565  3752 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:37.344455  3752 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:37.351200  3760 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:37.351204  3758 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:37.351264  3757 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:37.351636  3752 server_base.cc:1061] running on GCE node
I20260504 14:08:37.352383  3752 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:37.353032  3752 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:37.354267  3752 hybrid_clock.cc:648] HybridClock initialized: now 1777903717354254 us; error 59 us; skew 500 ppm
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903717, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:37.357282  3752 init.cc:377] Logged in from keytab as oryx/127.25.254.194@KRBTEST.COM (short username oryx)
I20260504 14:08:37.358433  3752 webserver.cc:492] Webserver started at http://127.25.254.194:37179/ using document root <none> and password file <none>
I20260504 14:08:37.358978  3752 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:37.359037  3752 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:37.362653  3752 fs_manager.cc:714] Time spent opening directory manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:37.365453  3767 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.366794  3752 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.003s	sys 0.000s
I20260504 14:08:37.366928  3752 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "84b501acecd04fcc913513d254756db8"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "6e75143df480ff7dab1a4b931d74ccb5"
server_key_iv: "b759f6a2e1dc835597771cbe5c79ed44"
server_key_version: "encryptionkey@0"
I20260504 14:08:37.367372  3752 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:37.401161  3752 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:37.404317  3752 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:37.404526  3752 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:37.405162  3752 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:37.406101  3752 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:37.406173  3752 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.406266  3752 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:37.406286  3752 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.416090  3752 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:42815
I20260504 14:08:37.416105  3880 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:42815 every 8 connection(s)
I20260504 14:08:37.417145  3752 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:37.424686 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3752
I20260504 14:08:37.425227 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:37221
--local_ip_for_outbound_sockets=127.25.254.195
--tserver_master_addrs=127.25.254.254:35629
--webserver_port=33219
--webserver_interface=127.25.254.195
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:39769
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--principal=oryx/127.25.254.195
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.195.keytab with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
W20260504 14:08:37.428124  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.419146 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:43503 (local address 127.25.254.254:35629)
0504 14:08:37.419337 (+   191us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.419341 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.420044 (+   703us) server_negotiation.cc:408] Connection header received
0504 14:08:37.420944 (+   900us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.420948 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.421011 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.421121 (+   110us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.422823 (+  1702us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.423321 (+   498us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.424011 (+   690us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.424182 (+   171us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.427993 (+  3811us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:43503: BlockingRecv error: recv got EOF from 127.25.254.194:43503 (error 108)
Metrics: {"server-negotiator.queue_time_us":94}
I20260504 14:08:37.428644  3883 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.419383 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:43503)
0504 14:08:37.419859 (+   476us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.419894 (+    35us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.420727 (+   833us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.421299 (+   572us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.421309 (+    10us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.421686 (+   377us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.422657 (+   971us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.422672 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.423453 (+   781us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.423464 (+    11us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.423851 (+   387us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.423857 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.424096 (+   239us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.424915 (+   819us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.424942 (+    27us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.427841 (+  2899us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":298,"thread_start_us":119,"threads_started":1}
W20260504 14:08:37.429179  3881 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:35629 (0 consecutive failures): Not authorized: Failed to ping master at 127.25.254.254:35629: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.435442  3883 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.429688 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:40019)
0504 14:08:37.429918 (+   230us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.429933 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.430042 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.430374 (+   332us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.430381 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.430596 (+   215us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.430892 (+   296us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.430901 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.431646 (+   745us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.431651 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.432334 (+   683us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.432347 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.432480 (+   133us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.432982 (+   502us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.433004 (+    22us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.435291 (+  2287us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":130}
W20260504 14:08:37.435598  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.429773 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:40019 (local address 127.25.254.254:35629)
0504 14:08:37.429884 (+   111us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.429887 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.430003 (+   116us) server_negotiation.cc:408] Connection header received
0504 14:08:37.430143 (+   140us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.430146 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.430250 (+   104us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.430322 (+    72us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.431038 (+   716us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.431517 (+   479us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.432467 (+   950us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.432601 (+   134us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.435436 (+  2835us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:40019: BlockingRecv error: recv got EOF from 127.25.254.194:40019 (error 108)
Metrics: {"server-negotiator.queue_time_us":30}
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.441382  3883 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.436243 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:49029)
0504 14:08:37.436439 (+   196us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.436453 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.436517 (+    64us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.436870 (+   353us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.436875 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.437047 (+   172us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.437285 (+   238us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.437296 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.438081 (+   785us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.438085 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.438691 (+   606us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.438712 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.438837 (+   125us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.439413 (+   576us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.439427 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.441240 (+  1813us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":121}
W20260504 14:08:37.441588  3750 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.436341 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:49029 (local address 127.25.254.254:35629)
0504 14:08:37.436465 (+   124us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.436468 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.436499 (+    31us) server_negotiation.cc:408] Connection header received
0504 14:08:37.436645 (+   146us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.436697 (+    52us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.436744 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.436819 (+    75us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.437433 (+   614us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.437958 (+   525us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.438855 (+   897us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.439087 (+   232us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.441419 (+  2332us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:49029: BlockingRecv error: recv got EOF from 127.25.254.194:49029 (error 108)
Metrics: {"server-negotiator.queue_time_us":45}
W20260504 14:08:37.441722  3881 heartbeater.cc:412] Failed 3 heartbeats in a row: no longer allowing fast heartbeat attempts.
W20260504 14:08:37.535470  3884 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:37.535756  3884 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:37.535851  3884 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:37.539328  3884 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:37.539450  3884 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:37.539572  3884 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:37.544001  3884 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:39769
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/krb5kdc/oryx_127.25.254.195.keytab
--principal=oryx/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:37221
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=33219
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:35629
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.3884
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:37.545203  3884 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:37.546096  3884 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:37.553221  3890 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:37.553206  3892 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:37.553180  3889 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:37.553645  3884 server_base.cc:1061] running on GCE node
I20260504 14:08:37.554107  3884 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:37.554741  3884 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:37.555956  3884 hybrid_clock.cc:648] HybridClock initialized: now 1777903717555908 us; error 40 us; skew 500 ppm
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903717, etypes {rep=17 tkt=17 ses=17}, oryx/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:37.559141  3884 init.cc:377] Logged in from keytab as oryx/127.25.254.195@KRBTEST.COM (short username oryx)
I20260504 14:08:37.560190  3884 webserver.cc:492] Webserver started at http://127.25.254.195:33219/ using document root <none> and password file <none>
I20260504 14:08:37.560722  3884 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:37.560782  3884 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:37.564407  3884 fs_manager.cc:714] Time spent opening directory manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:37.566460  3899 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.567644  3884 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:37.567762  3884 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "b0a7802e080e42fab5f8fad9cf8ef396"
format_stamp: "Formatted at 2026-05-04 14:08:36 on dist-test-slave-2x32"
server_key: "d39fbb1923d4dec1f291047a3621138b"
server_key_iv: "33e1b67365d8481503557544e694172b"
server_key_version: "encryptionkey@0"
I20260504 14:08:37.568200  3884 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:37.586784  3884 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:37.589488  3884 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:37.589640  3884 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:37.590255  3884 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:37.591164  3884 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:37.591213  3884 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.591254  3884 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:37.591269  3884 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:37.600675  3884 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37221
I20260504 14:08:37.600704  4012 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37221 every 8 connection(s)
I20260504 14:08:37.601763  3884 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestMismatchingPrincipals.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:37.602471 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 3884
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.612731  3750 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.603269 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:42220 (local address 127.25.254.254:35629)
0504 14:08:37.603478 (+   209us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.603483 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.603663 (+   180us) server_negotiation.cc:408] Connection header received
0504 14:08:37.603909 (+   246us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.603912 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.604012 (+   100us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.604102 (+    90us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.605333 (+  1231us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.606142 (+   809us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.606888 (+   746us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.607089 (+   201us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.608275 (+  1186us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:37.608295 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:37.608306 (+    11us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:37.608334 (+    28us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:37.610595 (+  2261us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:37.611127 (+   532us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:37.611132 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:37.611137 (+     5us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:37.611212 (+    75us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:37.611505 (+   293us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:37.611508 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:37.611510 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:37.611846 (+   336us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:37.612067 (+   221us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:37.612364 (+   297us) server_negotiation.cc:300] Negotiation successful
0504 14:08:37.612600 (+   236us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":63}
W20260504 14:08:37.613296  4016 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.604018 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:44275 (local address 127.25.254.254:35629)
0504 14:08:37.604278 (+   260us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.604281 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.604975 (+   694us) server_negotiation.cc:408] Connection header received
0504 14:08:37.606016 (+  1041us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.606019 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.606061 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.606143 (+    82us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.607969 (+  1826us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.608618 (+   649us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.609629 (+  1011us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.609783 (+   154us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.613186 (+  3403us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:44275: BlockingRecv error: recv got EOF from 127.25.254.195:44275 (error 108)
Metrics: {"server-negotiator.queue_time_us":208,"thread_start_us":66,"threads_started":1}
I20260504 14:08:37.613739  4017 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.604261 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:44275)
0504 14:08:37.604821 (+   560us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.604855 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.605786 (+   931us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.606362 (+   576us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.606370 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.606763 (+   393us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.607811 (+  1048us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.607825 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.608823 (+   998us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.608828 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.609477 (+   649us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.609489 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.609815 (+   326us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.610447 (+   632us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.610466 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.613058 (+  2592us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":305,"thread_start_us":140,"threads_started":1}
W20260504 14:08:37.614279  4013 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:35629 (0 consecutive failures): Not authorized: Failed to ping master at 127.25.254.254:35629: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.619197  4017 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.614745 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:55105)
0504 14:08:37.614960 (+   215us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.614976 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.615054 (+    78us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.615261 (+   207us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.615263 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.615388 (+   125us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.615624 (+   236us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.615631 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.616348 (+   717us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.616351 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.616733 (+   382us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.616740 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.616831 (+    91us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.617314 (+   483us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.617327 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.619055 (+  1728us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":128}
W20260504 14:08:37.619300  4016 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.614920 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:55105 (local address 127.25.254.254:35629)
0504 14:08:37.615053 (+   133us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.615057 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.615070 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:37.615111 (+    41us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.615114 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.615158 (+    44us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.615240 (+    82us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.615765 (+   525us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.616243 (+   478us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.616840 (+   597us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.617005 (+   165us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.619165 (+  2160us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:55105: BlockingRecv error: recv got EOF from 127.25.254.195:55105 (error 108)
Metrics: {"server-negotiator.queue_time_us":55}
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:37 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:37.624173  4017 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:37.619794 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:41413)
0504 14:08:37.619993 (+   199us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:37.620008 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:37.620108 (+   100us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:37.620382 (+   274us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:37.620385 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:37.620538 (+   153us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:37.620680 (+   142us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.620685 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.621369 (+   684us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.621374 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:37.621747 (+   373us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:37.621754 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.621877 (+   123us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.622414 (+   537us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:37.622427 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:37.624034 (+  1607us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":55}
W20260504 14:08:37.624241  4016 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:37.620042 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:41413 (local address 127.25.254.254:35629)
0504 14:08:37.620206 (+   164us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:37.620210 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:37.620220 (+    10us) server_negotiation.cc:408] Connection header received
0504 14:08:37.620247 (+    27us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:37.620250 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:37.620282 (+    32us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:37.620367 (+    85us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:37.620794 (+   427us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.621264 (+   470us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:37.621866 (+   602us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:37.622073 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:37.624137 (+  2064us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:41413: BlockingRecv error: recv got EOF from 127.25.254.195:41413 (error 108)
Metrics: {"server-negotiator.queue_time_us":71}
W20260504 14:08:37.624418  4013 heartbeater.cc:412] Failed 3 heartbeats in a row: no longer allowing fast heartbeat attempts.
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:38.239228  4018 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:38.233778 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:44539)
0504 14:08:38.234107 (+   329us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:38.234122 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:38.234274 (+   152us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:38.234546 (+   272us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:38.234549 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:38.234775 (+   226us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:38.235006 (+   231us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.235012 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.235857 (+   845us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.235860 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:38.236293 (+   433us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.236301 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.236408 (+   107us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.237025 (+   617us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:38.237040 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:38.239102 (+  2062us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":230,"thread_start_us":93,"threads_started":1}
W20260504 14:08:38.239424  4019 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:38.233870 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:44539 (local address 127.25.254.254:35629)
0504 14:08:38.234241 (+   371us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:38.234245 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:38.234259 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:38.234369 (+   110us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:38.234372 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:38.234418 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:38.234497 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:38.235180 (+   683us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.235717 (+   537us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.236433 (+   716us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.236614 (+   181us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.239243 (+  2629us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:44539: BlockingRecv error: recv got EOF from 127.25.254.193:44539 (error 108)
Metrics: {"server-negotiator.queue_time_us":238,"thread_start_us":97,"threads_started":1}
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:38.447856  4020 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:38.442553 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:50415)
0504 14:08:38.442857 (+   304us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:38.442870 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:38.442982 (+   112us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:38.443274 (+   292us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:38.443277 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:38.443453 (+   176us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:38.443746 (+   293us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.443753 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.444518 (+   765us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.444521 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:38.444970 (+   449us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.444978 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.445077 (+    99us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.445683 (+   606us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:38.445703 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:38.447678 (+  1975us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":228,"thread_start_us":81,"threads_started":1}
W20260504 14:08:38.448033  4019 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:38.442692 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:50415 (local address 127.25.254.254:35629)
0504 14:08:38.442876 (+   184us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:38.442879 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:38.442943 (+    64us) server_negotiation.cc:408] Connection header received
0504 14:08:38.443077 (+   134us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:38.443080 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:38.443129 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:38.443199 (+    70us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:38.443875 (+   676us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.444398 (+   523us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.445100 (+   702us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.445259 (+   159us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.447914 (+  2655us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:50415: BlockingRecv error: recv got EOF from 127.25.254.194:50415 (error 108)
Metrics: {"server-negotiator.queue_time_us":100}
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:38 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:38.630661  4021 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:38.625130 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:37719)
0504 14:08:38.625410 (+   280us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:38.625425 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:38.625529 (+   104us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:38.625886 (+   357us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:38.625890 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:38.626078 (+   188us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:38.626345 (+   267us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.626351 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.627195 (+   844us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.627198 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:38.627595 (+   397us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:38.627602 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.627760 (+   158us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.628403 (+   643us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:38.628417 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:38.630483 (+  2066us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":186,"thread_start_us":76,"threads_started":1}
W20260504 14:08:38.630811  4019 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:38.625279 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:37719 (local address 127.25.254.254:35629)
0504 14:08:38.625475 (+   196us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:38.625479 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:38.625495 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:38.625625 (+   130us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:38.625628 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:38.625674 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:38.625804 (+   130us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:38.626508 (+   704us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.627062 (+   554us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:38.627705 (+   643us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:38.627866 (+   161us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:38.630663 (+  2797us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:37719: BlockingRecv error: recv got EOF from 127.25.254.195:37719 (error 108)
Metrics: {"server-negotiator.queue_time_us":95}
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:39.245277  4022 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:39.240113 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:59439)
0504 14:08:39.240384 (+   271us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:39.240398 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:39.240490 (+    92us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:39.240794 (+   304us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:39.240797 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:39.241010 (+   213us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:39.241203 (+   193us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.241208 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.242013 (+   805us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.242017 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:39.242508 (+   491us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.242517 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.242622 (+   105us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.243218 (+   596us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:39.243232 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:39.245151 (+  1919us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":181,"thread_start_us":91,"threads_started":1}
W20260504 14:08:39.245483  4023 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:39.240195 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:59439 (local address 127.25.254.254:35629)
0504 14:08:39.240467 (+   272us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:39.240470 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:39.240494 (+    24us) server_negotiation.cc:408] Connection header received
0504 14:08:39.240603 (+   109us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:39.240606 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:39.240657 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:39.240752 (+    95us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:39.241346 (+   594us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.241865 (+   519us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.242646 (+   781us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.242845 (+   199us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.245295 (+  2450us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:59439: BlockingRecv error: recv got EOF from 127.25.254.193:59439 (error 108)
Metrics: {"server-negotiator.queue_time_us":198,"thread_start_us":115,"threads_started":1}
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:39.454241  4024 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:39.448981 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:35215)
0504 14:08:39.449302 (+   321us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:39.449317 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:39.449416 (+    99us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:39.449815 (+   399us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:39.449819 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:39.450054 (+   235us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:39.450319 (+   265us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.450326 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.451066 (+   740us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.451070 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:39.451446 (+   376us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.451452 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.451578 (+   126us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.452134 (+   556us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:39.452148 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:39.454015 (+  1867us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":222,"thread_start_us":105,"threads_started":1}
W20260504 14:08:39.454258  4023 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:39.449103 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:35215 (local address 127.25.254.254:35629)
0504 14:08:39.449362 (+   259us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:39.449365 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:39.449384 (+    19us) server_negotiation.cc:408] Connection header received
0504 14:08:39.449569 (+   185us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:39.449573 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:39.449653 (+    80us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:39.449786 (+   133us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:39.450445 (+   659us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.450944 (+   499us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.451560 (+   616us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.451840 (+   280us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.454127 (+  2287us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:35215: BlockingRecv error: recv got EOF from 127.25.254.194:35215 (error 108)
Metrics: {"server-negotiator.queue_time_us":159}
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:39 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:39.637594  4025 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:39.631789 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:60543)
0504 14:08:39.632158 (+   369us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:39.632176 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:39.632292 (+   116us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:39.632696 (+   404us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:39.632699 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:39.632998 (+   299us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:39.633197 (+   199us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.633203 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.634036 (+   833us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.634039 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:39.634488 (+   449us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:39.634495 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.634596 (+   101us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.635172 (+   576us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:39.635185 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:39.637399 (+  2214us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":265,"thread_start_us":111,"threads_started":1}
W20260504 14:08:39.637642  4023 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:39.632074 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:60543 (local address 127.25.254.254:35629)
0504 14:08:39.632232 (+   158us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:39.632236 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:39.632251 (+    15us) server_negotiation.cc:408] Connection header received
0504 14:08:39.632478 (+   227us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:39.632482 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:39.632544 (+    62us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:39.632661 (+   117us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:39.633342 (+   681us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.633922 (+   580us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:39.634609 (+   687us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:39.634815 (+   206us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:39.637526 (+  2711us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:60543: BlockingRecv error: recv got EOF from 127.25.254.195:60543 (error 108)
Metrics: {"server-negotiator.queue_time_us":59}
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:40.252449  4026 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:40.246418 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:55947)
0504 14:08:40.246751 (+   333us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:40.246769 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:40.246880 (+   111us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:40.247325 (+   445us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:40.247328 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:40.247557 (+   229us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:40.247872 (+   315us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.247881 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.248944 (+  1063us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.248947 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:40.249343 (+   396us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.249349 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.249464 (+   115us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.250188 (+   724us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:40.250202 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:40.252272 (+  2070us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":219,"thread_start_us":116,"threads_started":1}
W20260504 14:08:40.252524  4027 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:40.246520 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:55947 (local address 127.25.254.254:35629)
0504 14:08:40.247014 (+   494us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:40.247018 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:40.247037 (+    19us) server_negotiation.cc:408] Connection header received
0504 14:08:40.247090 (+    53us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:40.247095 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:40.247158 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:40.247250 (+    92us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:40.248026 (+   776us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.248794 (+   768us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.249501 (+   707us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.249653 (+   152us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.252401 (+  2748us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:55947: BlockingRecv error: recv got EOF from 127.25.254.193:55947 (error 108)
Metrics: {"server-negotiator.queue_time_us":412,"thread_start_us":276,"threads_started":1}
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:40.460808  4028 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:40.455186 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:52113)
0504 14:08:40.455547 (+   361us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:40.455563 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:40.455715 (+   152us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:40.455999 (+   284us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:40.456003 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:40.456240 (+   237us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:40.456528 (+   288us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.456541 (+    13us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.457467 (+   926us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.457472 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:40.457869 (+   397us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.457876 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.458010 (+   134us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.458709 (+   699us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:40.458725 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:40.460616 (+  1891us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":273,"thread_start_us":115,"threads_started":1}
W20260504 14:08:40.460865  4027 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:40.455372 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:52113 (local address 127.25.254.254:35629)
0504 14:08:40.455521 (+   149us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:40.455525 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:40.455660 (+   135us) server_negotiation.cc:408] Connection header received
0504 14:08:40.455815 (+   155us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:40.455818 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:40.455883 (+    65us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:40.455978 (+    95us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:40.456657 (+   679us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.457342 (+   685us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.457992 (+   650us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.458215 (+   223us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.460746 (+  2531us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:52113: BlockingRecv error: recv got EOF from 127.25.254.194:52113 (error 108)
Metrics: {"server-negotiator.queue_time_us":47}
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:40 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:40.648792  4029 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:40.638668 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:49437)
0504 14:08:40.639128 (+   460us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:40.639147 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:40.639255 (+   108us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:40.639550 (+   295us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:40.639553 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:40.639859 (+   306us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:40.640145 (+   286us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.640153 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.641119 (+   966us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.641123 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:40.641532 (+   409us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:40.641539 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.641653 (+   114us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.646396 (+  4743us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:40.646415 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:40.648655 (+  2240us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":349,"thread_start_us":215,"threads_started":1}
W20260504 14:08:40.649092  4027 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:40.638968 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:49437 (local address 127.25.254.254:35629)
0504 14:08:40.639165 (+   197us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:40.639169 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:40.639207 (+    38us) server_negotiation.cc:408] Connection header received
0504 14:08:40.639356 (+   149us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:40.639362 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:40.639414 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:40.639492 (+    78us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:40.640276 (+   784us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.640965 (+   689us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:40.641700 (+   735us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:40.641907 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:40.648960 (+  7053us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:49437: BlockingRecv error: recv got EOF from 127.25.254.195:49437 (error 108)
Metrics: {"server-negotiator.queue_time_us":78}
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:41.264418  4030 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:41.253430 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:44613)
0504 14:08:41.253879 (+   449us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:41.253896 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:41.254014 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:41.254399 (+   385us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:41.254402 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:41.254587 (+   185us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:41.254930 (+   343us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.254940 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.255877 (+   937us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.255919 (+    42us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:41.256374 (+   455us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.256382 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.256518 (+   136us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.262213 (+  5695us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:41.262231 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:41.264244 (+  2013us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":364,"thread_start_us":97,"threads_started":1}
W20260504 14:08:41.264528  4031 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:41.253591 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:44613 (local address 127.25.254.254:35629)
0504 14:08:41.253875 (+   284us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:41.253879 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:41.253984 (+   105us) server_negotiation.cc:408] Connection header received
0504 14:08:41.254140 (+   156us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:41.254144 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:41.254272 (+   128us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:41.254359 (+    87us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:41.255063 (+   704us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.255727 (+   664us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.256501 (+   774us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.256717 (+   216us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.264389 (+  7672us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:44613: BlockingRecv error: recv got EOF from 127.25.254.193:44613 (error 108)
Metrics: {"server-negotiator.queue_time_us":184,"thread_start_us":62,"threads_started":1}
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:41.466691  4032 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:41.461730 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:36357)
0504 14:08:41.462034 (+   304us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:41.462047 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:41.462189 (+   142us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:41.462478 (+   289us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:41.462481 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:41.462658 (+   177us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:41.462850 (+   192us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.462855 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.463584 (+   729us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.463587 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:41.464011 (+   424us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.464018 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.464113 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.464718 (+   605us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:41.464734 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:41.466557 (+  1823us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":210,"thread_start_us":92,"threads_started":1}
W20260504 14:08:41.466825  4031 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:41.461817 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:36357 (local address 127.25.254.254:35629)
0504 14:08:41.461995 (+   178us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:41.461999 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:41.462118 (+   119us) server_negotiation.cc:408] Connection header received
0504 14:08:41.462286 (+   168us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:41.462289 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:41.462338 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:41.462411 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:41.462959 (+   548us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.463476 (+   517us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.464119 (+   643us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.464254 (+   135us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.466697 (+  2443us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:36357: BlockingRecv error: recv got EOF from 127.25.254.194:36357 (error 108)
Metrics: {"server-negotiator.queue_time_us":73}
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:41 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:41.655247  4033 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:41.649819 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:47231)
0504 14:08:41.650144 (+   325us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:41.650208 (+    64us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:41.650340 (+   132us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:41.650759 (+   419us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:41.650762 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:41.650950 (+   188us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:41.651240 (+   290us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.651247 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.652030 (+   783us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.652033 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:41.652407 (+   374us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:41.652413 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.652541 (+   128us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.653100 (+   559us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:41.653114 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:41.655075 (+  1961us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":229,"thread_start_us":141,"threads_started":1}
W20260504 14:08:41.655292  4031 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:41.650014 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:47231 (local address 127.25.254.254:35629)
0504 14:08:41.650196 (+   182us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:41.650202 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:41.650297 (+    95us) server_negotiation.cc:408] Connection header received
0504 14:08:41.650483 (+   186us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:41.650489 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:41.650560 (+    71us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:41.650652 (+    92us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:41.651365 (+   713us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.651877 (+   512us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:41.652525 (+   648us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:41.652735 (+   210us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:41.655191 (+  2456us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:47231: BlockingRecv error: recv got EOF from 127.25.254.195:47231 (error 108)
Metrics: {"server-negotiator.queue_time_us":75}
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:42.271047  4034 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:42.265482 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:36033)
0504 14:08:42.265814 (+   332us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:42.265834 (+    20us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:42.265944 (+   110us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:42.266367 (+   423us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:42.266370 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:42.266637 (+   267us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:42.266855 (+   218us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.266862 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.267652 (+   790us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.267659 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:42.268071 (+   412us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.268078 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.268220 (+   142us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.268841 (+   621us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:42.268855 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:42.270925 (+  2070us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":221,"thread_start_us":78,"threads_started":1}
W20260504 14:08:42.271353  4035 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:42.265658 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:36033 (local address 127.25.254.254:35629)
0504 14:08:42.265923 (+   265us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:42.265927 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:42.265950 (+    23us) server_negotiation.cc:408] Connection header received
0504 14:08:42.266025 (+    75us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:42.266028 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:42.266071 (+    43us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:42.266175 (+   104us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:42.266982 (+   807us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.267530 (+   548us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.268204 (+   674us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.268424 (+   220us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.271067 (+  2643us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:36033: BlockingRecv error: recv got EOF from 127.25.254.193:36033 (error 108)
Metrics: {"server-negotiator.queue_time_us":188,"thread_start_us":91,"threads_started":1}
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:42.474074  4036 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:42.467671 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:43517)
0504 14:08:42.468028 (+   357us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:42.468042 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:42.468144 (+   102us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:42.468513 (+   369us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:42.468517 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:42.468807 (+   290us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:42.469024 (+   217us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.469030 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.469943 (+   913us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.469946 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:42.470469 (+   523us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.470478 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.470586 (+   108us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.471186 (+   600us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:42.471200 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:42.473877 (+  2677us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":244,"thread_start_us":153,"threads_started":1}
W20260504 14:08:42.474131  4035 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:42.467904 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:43517 (local address 127.25.254.254:35629)
0504 14:08:42.468047 (+   143us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:42.468050 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:42.468113 (+    63us) server_negotiation.cc:408] Connection header received
0504 14:08:42.468282 (+   169us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:42.468286 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:42.468340 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:42.468432 (+    92us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:42.469230 (+   798us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.469796 (+   566us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.470610 (+   814us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.470846 (+   236us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.474014 (+  3168us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:43517: BlockingRecv error: recv got EOF from 127.25.254.194:43517 (error 108)
Metrics: {"server-negotiator.queue_time_us":56}
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:42 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:42.661190  4037 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:42.656184 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:34287)
0504 14:08:42.656518 (+   334us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:42.656534 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:42.656663 (+   129us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:42.656975 (+   312us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:42.656979 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:42.657200 (+   221us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:42.657402 (+   202us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.657407 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.658143 (+   736us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.658147 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:42.658560 (+   413us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:42.658567 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.658697 (+   130us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.659250 (+   553us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:42.659264 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:42.661026 (+  1762us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":258,"thread_start_us":107,"threads_started":1}
W20260504 14:08:42.661288  4035 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:42.656361 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:34287 (local address 127.25.254.254:35629)
0504 14:08:42.656512 (+   151us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:42.656516 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:42.656653 (+   137us) server_negotiation.cc:408] Connection header received
0504 14:08:42.656768 (+   115us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:42.656771 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:42.656828 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:42.656926 (+    98us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:42.657524 (+   598us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.658029 (+   505us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:42.658678 (+   649us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:42.658884 (+   206us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:42.661171 (+  2287us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:34287: BlockingRecv error: recv got EOF from 127.25.254.195:34287 (error 108)
Metrics: {"server-negotiator.queue_time_us":42}
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:43.279274  4038 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:43.271951 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:35031)
0504 14:08:43.272269 (+   318us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:43.272287 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:43.272411 (+   124us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:43.272857 (+   446us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:43.272862 (+     5us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:43.273146 (+   284us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:43.273447 (+   301us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.273458 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.274764 (+  1306us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.274768 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:43.275416 (+   648us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.275427 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.275587 (+   160us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.276337 (+   750us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:43.276352 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:43.279110 (+  2758us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":200,"thread_start_us":96,"threads_started":1}
W20260504 14:08:43.279471  4039 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:43.272089 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:35031 (local address 127.25.254.254:35629)
0504 14:08:43.272344 (+   255us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:43.272350 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:43.272369 (+    19us) server_negotiation.cc:408] Connection header received
0504 14:08:43.272540 (+   171us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:43.272543 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:43.272617 (+    74us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:43.272851 (+   234us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:43.273594 (+   743us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.274609 (+  1015us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.275552 (+   943us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.275738 (+   186us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.279324 (+  3586us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:35031: BlockingRecv error: recv got EOF from 127.25.254.193:35031 (error 108)
Metrics: {"server-negotiator.queue_time_us":168,"thread_start_us":117,"threads_started":1}
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:43.479936  4040 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:43.474892 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:40365)
0504 14:08:43.475177 (+   285us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:43.475190 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:43.475293 (+   103us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:43.475526 (+   233us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:43.475529 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:43.475708 (+   179us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:43.475906 (+   198us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.475912 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.476632 (+   720us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.476634 (+     2us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:43.477207 (+   573us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.477229 (+    22us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.477340 (+   111us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.477898 (+   558us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:43.477911 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:43.479817 (+  1906us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":211,"thread_start_us":77,"threads_started":1}
W20260504 14:08:43.480192  4039 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:43.475010 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:40365 (local address 127.25.254.254:35629)
0504 14:08:43.475175 (+   165us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:43.475179 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:43.475262 (+    83us) server_negotiation.cc:408] Connection header received
0504 14:08:43.475374 (+   112us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:43.475377 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:43.475423 (+    46us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:43.475489 (+    66us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:43.476030 (+   541us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.476526 (+   496us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.477339 (+   813us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.477469 (+   130us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.480066 (+  2597us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:40365: BlockingRecv error: recv got EOF from 127.25.254.194:40365 (error 108)
Metrics: {"server-negotiator.queue_time_us":72}
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:43 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:43.668098  4041 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:43.662237 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:59375)
0504 14:08:43.662511 (+   274us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:43.662525 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:43.662637 (+   112us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:43.662905 (+   268us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:43.662907 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:43.663121 (+   214us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:43.663315 (+   194us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.663320 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.664252 (+   932us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.664258 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:43.664758 (+   500us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:43.664767 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.664898 (+   131us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.665900 (+  1002us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:43.665914 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:43.667914 (+  2000us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":187,"thread_start_us":72,"threads_started":1}
W20260504 14:08:43.668149  4039 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:43.662333 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:59375 (local address 127.25.254.254:35629)
0504 14:08:43.662519 (+   186us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:43.662523 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:43.662601 (+    78us) server_negotiation.cc:408] Connection header received
0504 14:08:43.662722 (+   121us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:43.662725 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:43.662778 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:43.662853 (+    75us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:43.663456 (+   603us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.664120 (+   664us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:43.664891 (+   771us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:43.665116 (+   225us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:43.668043 (+  2927us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:59375: BlockingRecv error: recv got EOF from 127.25.254.195:59375 (error 108)
Metrics: {"server-negotiator.queue_time_us":71}
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:44.285801  4042 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:44.280213 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:55967)
0504 14:08:44.280552 (+   339us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:44.280567 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:44.280677 (+   110us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:44.280948 (+   271us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:44.280950 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:44.281143 (+   193us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:44.281329 (+   186us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.281334 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.282221 (+   887us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.282225 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:44.282619 (+   394us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.282626 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.282736 (+   110us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.283379 (+   643us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:44.283393 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:44.285529 (+  2136us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":257,"thread_start_us":76,"threads_started":1}
W20260504 14:08:44.285974  4043 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:44.280284 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:55967 (local address 127.25.254.254:35629)
0504 14:08:44.280582 (+   298us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:44.280586 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:44.280697 (+   111us) server_negotiation.cc:408] Connection header received
0504 14:08:44.280778 (+    81us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:44.280781 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:44.280831 (+    50us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:44.280909 (+    78us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:44.281464 (+   555us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.282058 (+   594us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.282814 (+   756us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.282983 (+   169us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.285829 (+  2846us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:55967: BlockingRecv error: recv got EOF from 127.25.254.193:55967 (error 108)
Metrics: {"server-negotiator.queue_time_us":216,"thread_start_us":105,"threads_started":1}
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:44.486338  4044 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:44.480759 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:60039)
0504 14:08:44.481088 (+   329us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:44.481109 (+    21us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:44.481213 (+   104us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:44.481503 (+   290us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:44.481506 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:44.481688 (+   182us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:44.481937 (+   249us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.481942 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.482762 (+   820us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.482767 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:44.483216 (+   449us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.483223 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.483335 (+   112us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.484017 (+   682us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:44.484032 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:44.486117 (+  2085us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":225,"thread_start_us":107,"threads_started":1}
W20260504 14:08:44.486418  4043 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:44.480857 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:60039 (local address 127.25.254.254:35629)
0504 14:08:44.480999 (+   142us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:44.481003 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:44.481177 (+   174us) server_negotiation.cc:408] Connection header received
0504 14:08:44.481332 (+   155us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:44.481335 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:44.481384 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:44.481456 (+    72us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:44.482089 (+   633us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.482618 (+   529us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.483362 (+   744us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.483513 (+   151us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.486320 (+  2807us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:60039: BlockingRecv error: recv got EOF from 127.25.254.194:60039 (error 108)
Metrics: {"server-negotiator.queue_time_us":58}
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:44 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:44.675143  4045 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:44.669050 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:46429)
0504 14:08:44.669460 (+   410us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:44.669476 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:44.669633 (+   157us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:44.669992 (+   359us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:44.669996 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:44.670237 (+   241us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:44.670426 (+   189us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.670432 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.671242 (+   810us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.671246 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:44.671653 (+   407us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:44.671670 (+    17us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.671810 (+   140us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.672457 (+   647us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:44.672471 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:44.674970 (+  2499us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":327,"thread_start_us":113,"threads_started":1}
W20260504 14:08:44.675196  4043 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:44.669225 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:46429 (local address 127.25.254.254:35629)
0504 14:08:44.669424 (+   199us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:44.669429 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:44.669566 (+   137us) server_negotiation.cc:408] Connection header received
0504 14:08:44.669811 (+   245us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:44.669815 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:44.669872 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:44.669972 (+   100us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:44.670558 (+   586us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.671109 (+   551us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:44.671790 (+   681us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:44.672017 (+   227us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:44.675090 (+  3073us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:46429: BlockingRecv error: recv got EOF from 127.25.254.195:46429 (error 108)
Metrics: {"server-negotiator.queue_time_us":88}
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:45.291836  4046 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:45.286709 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:60765)
0504 14:08:45.286990 (+   281us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:45.287005 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:45.287099 (+    94us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:45.287419 (+   320us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:45.287422 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:45.287602 (+   180us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:45.287817 (+   215us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.287822 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.288594 (+   772us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.288597 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:45.289037 (+   440us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.289044 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.289177 (+   133us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.289799 (+   622us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:45.289813 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:45.291705 (+  1892us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":182,"thread_start_us":97,"threads_started":1}
W20260504 14:08:45.292196  4047 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:45.286920 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60765 (local address 127.25.254.254:35629)
0504 14:08:45.287177 (+   257us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:45.287180 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:45.287193 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:45.287233 (+    40us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:45.287236 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:45.287281 (+    45us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:45.287367 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:45.287943 (+   576us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.288463 (+   520us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.289167 (+   704us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.289382 (+   215us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.292071 (+  2689us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:60765: BlockingRecv error: recv got EOF from 127.25.254.193:60765 (error 108)
Metrics: {"server-negotiator.queue_time_us":183,"thread_start_us":111,"threads_started":1}
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:45.493309  4048 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:45.487487 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:34019)
0504 14:08:45.487871 (+   384us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:45.487890 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:45.488019 (+   129us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:45.488330 (+   311us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:45.488332 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:45.488519 (+   187us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:45.488718 (+   199us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.488723 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.489752 (+  1029us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.489756 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:45.490203 (+   447us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.490210 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.490323 (+   113us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.490923 (+   600us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:45.490939 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:45.493061 (+  2122us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":215,"thread_start_us":125,"threads_started":1}
W20260504 14:08:45.493374  4047 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:45.487614 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:34019 (local address 127.25.254.254:35629)
0504 14:08:45.487786 (+   172us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:45.487789 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:45.488025 (+   236us) server_negotiation.cc:408] Connection header received
0504 14:08:45.488101 (+    76us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:45.488104 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:45.488158 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:45.488237 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:45.488853 (+   616us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.489557 (+   704us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.490352 (+   795us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.490498 (+   146us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.493250 (+  2752us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:34019: BlockingRecv error: recv got EOF from 127.25.254.194:34019 (error 108)
Metrics: {"server-negotiator.queue_time_us":80}
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:45 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:45.681521  4049 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:45.676026 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:47351)
0504 14:08:45.676335 (+   309us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:45.676348 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:45.676456 (+   108us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:45.676763 (+   307us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:45.676765 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:45.676997 (+   232us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:45.677293 (+   296us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.677301 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.678383 (+  1082us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.678386 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:45.678774 (+   388us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:45.678781 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.678876 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.679379 (+   503us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:45.679392 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:45.681403 (+  2011us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":213,"thread_start_us":124,"threads_started":1}
W20260504 14:08:45.681670  4047 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:45.676120 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:47351 (local address 127.25.254.254:35629)
0504 14:08:45.676297 (+   177us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:45.676301 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:45.676433 (+   132us) server_negotiation.cc:408] Connection header received
0504 14:08:45.676559 (+   126us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:45.676562 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:45.676611 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:45.676723 (+   112us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:45.677443 (+   720us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.678269 (+   826us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:45.678879 (+   610us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:45.679013 (+   134us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:45.681542 (+  2529us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:47351: BlockingRecv error: recv got EOF from 127.25.254.195:47351 (error 108)
Metrics: {"server-negotiator.queue_time_us":53}
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:46.302738  4050 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:46.292625 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:56759)
0504 14:08:46.292960 (+   335us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:46.292975 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:46.293071 (+    96us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:46.293383 (+   312us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:46.293386 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:46.293574 (+   188us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:46.293796 (+   222us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.293803 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.294614 (+   811us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.294617 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:46.295015 (+   398us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.295023 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.295118 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.300375 (+  5257us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:46.300394 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:46.302568 (+  2174us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":245,"thread_start_us":162,"threads_started":1}
W20260504 14:08:46.303017  4051 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:46.292722 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:56759 (local address 127.25.254.254:35629)
0504 14:08:46.293002 (+   280us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:46.293005 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:46.293021 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:46.293165 (+   144us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:46.293167 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:46.293214 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:46.293300 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:46.293945 (+   645us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.294507 (+   562us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.295133 (+   626us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.295272 (+   139us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.302721 (+  7449us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:56759: BlockingRecv error: recv got EOF from 127.25.254.193:56759 (error 108)
Metrics: {"server-negotiator.queue_time_us":212,"thread_start_us":128,"threads_started":1}
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:46.499873  4052 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:46.494320 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:52123)
0504 14:08:46.494633 (+   313us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:46.494647 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:46.494791 (+   144us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:46.495112 (+   321us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:46.495115 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:46.495381 (+   266us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:46.495672 (+   291us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.495681 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.496501 (+   820us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.496505 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:46.496894 (+   389us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.496901 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.497018 (+   117us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.497593 (+   575us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:46.497608 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:46.499698 (+  2090us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":219,"thread_start_us":120,"threads_started":1}
W20260504 14:08:46.499938  4051 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:46.494475 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:52123 (local address 127.25.254.254:35629)
0504 14:08:46.494631 (+   156us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:46.494634 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:46.494750 (+   116us) server_negotiation.cc:408] Connection header received
0504 14:08:46.494909 (+   159us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:46.494912 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:46.494966 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:46.495077 (+   111us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:46.495832 (+   755us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.496344 (+   512us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.497027 (+   683us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.497192 (+   165us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.499828 (+  2636us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:52123: BlockingRecv error: recv got EOF from 127.25.254.194:52123 (error 108)
Metrics: {"server-negotiator.queue_time_us":67}
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:46 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:46.688131  4053 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:46.682546 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:36781)
0504 14:08:46.682904 (+   358us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:46.682924 (+    20us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:46.683047 (+   123us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:46.683310 (+   263us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:46.683313 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:46.683500 (+   187us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:46.683830 (+   330us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.683840 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.684663 (+   823us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.684667 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:46.685063 (+   396us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:46.685069 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.685209 (+   140us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.685815 (+   606us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:46.685828 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:46.687925 (+  2097us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":274,"thread_start_us":176,"threads_started":1}
W20260504 14:08:46.688169  4051 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:46.682630 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:36781 (local address 127.25.254.254:35629)
0504 14:08:46.682802 (+   172us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:46.682807 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:46.683038 (+   231us) server_negotiation.cc:408] Connection header received
0504 14:08:46.683122 (+    84us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:46.683124 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:46.683172 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:46.683259 (+    87us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:46.684024 (+   765us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.684522 (+   498us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:46.685192 (+   670us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:46.685411 (+   219us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:46.688065 (+  2654us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:36781: BlockingRecv error: recv got EOF from 127.25.254.195:36781 (error 108)
Metrics: {"server-negotiator.queue_time_us":61}
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:47.309506  4054 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:47.303714 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:49525)
0504 14:08:47.304099 (+   385us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:47.304114 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:47.304208 (+    94us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:47.304626 (+   418us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:47.304629 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:47.304858 (+   229us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:47.305047 (+   189us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.305053 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.305956 (+   903us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.305960 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:47.306463 (+   503us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.306474 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.306595 (+   121us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.307279 (+   684us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:47.307293 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:47.309377 (+  2084us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":302,"thread_start_us":94,"threads_started":1}
W20260504 14:08:47.309679  4055 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:47.303856 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:49525 (local address 127.25.254.254:35629)
0504 14:08:47.304151 (+   295us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:47.304154 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:47.304168 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:47.304337 (+   169us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:47.304340 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:47.304388 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:47.304485 (+    97us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:47.305183 (+   698us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.305828 (+   645us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.306623 (+   795us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.306909 (+   286us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.309528 (+  2619us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:49525: BlockingRecv error: recv got EOF from 127.25.254.193:49525 (error 108)
Metrics: {"server-negotiator.queue_time_us":228,"thread_start_us":76,"threads_started":1}
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:47.506006  4056 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:47.500774 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:37525)
0504 14:08:47.501048 (+   274us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:47.501066 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:47.501165 (+    99us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:47.501449 (+   284us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:47.501453 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:47.501655 (+   202us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:47.501910 (+   255us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.501916 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.502773 (+   857us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.502776 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:47.503172 (+   396us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.503179 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.503277 (+    98us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.503827 (+   550us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:47.503840 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:47.505854 (+  2014us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":187,"thread_start_us":101,"threads_started":1}
W20260504 14:08:47.506165  4055 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:47.501002 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:37525 (local address 127.25.254.254:35629)
0504 14:08:47.501143 (+   141us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:47.501147 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:47.501161 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:47.501248 (+    87us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:47.501251 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:47.501299 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:47.501376 (+    77us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:47.502063 (+   687us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.502609 (+   546us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.503330 (+   721us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.503547 (+   217us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.506026 (+  2479us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:37525: BlockingRecv error: recv got EOF from 127.25.254.194:37525 (error 108)
Metrics: {"server-negotiator.queue_time_us":57}
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:47 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:47.695024  4057 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:47.689032 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:43613)
0504 14:08:47.689353 (+   321us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:47.689368 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:47.689451 (+    83us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:47.689823 (+   372us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:47.689826 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:47.690016 (+   190us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:47.690250 (+   234us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.690257 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.691294 (+  1037us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.691297 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:47.691749 (+   452us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:47.691757 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.691867 (+   110us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.692488 (+   621us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:47.692502 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:47.694815 (+  2313us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":219,"thread_start_us":126,"threads_started":1}
W20260504 14:08:47.695050  4055 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:47.689212 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:43613 (local address 127.25.254.254:35629)
0504 14:08:47.689391 (+   179us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:47.689395 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:47.689435 (+    40us) server_negotiation.cc:408] Connection header received
0504 14:08:47.689605 (+   170us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:47.689608 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:47.689656 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:47.689741 (+    85us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:47.690367 (+   626us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.691145 (+   778us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:47.691894 (+   749us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:47.692059 (+   165us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:47.694934 (+  2875us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:43613: BlockingRecv error: recv got EOF from 127.25.254.195:43613 (error 108)
Metrics: {"server-negotiator.queue_time_us":88}
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:48.315564  4058 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:48.310430 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:60745)
0504 14:08:48.310747 (+   317us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:48.310761 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:48.310855 (+    94us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:48.311123 (+   268us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:48.311126 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:48.311311 (+   185us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:48.311498 (+   187us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.311503 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.312326 (+   823us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.312329 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:48.312704 (+   375us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.312709 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.312797 (+    88us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.313366 (+   569us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:48.313379 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:48.315389 (+  2010us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":223,"thread_start_us":87,"threads_started":1}
W20260504 14:08:48.315637  4059 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:48.310488 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:60745 (local address 127.25.254.254:35629)
0504 14:08:48.310802 (+   314us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:48.310806 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:48.310820 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:48.310945 (+   125us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:48.310948 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:48.311005 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:48.311085 (+    80us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:48.311640 (+   555us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.312216 (+   576us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.312844 (+   628us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.313083 (+   239us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.315516 (+  2433us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:60745: BlockingRecv error: recv got EOF from 127.25.254.193:60745 (error 108)
Metrics: {"server-negotiator.queue_time_us":240,"thread_start_us":115,"threads_started":1}
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:48.512632  4060 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:48.507182 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:41869)
0504 14:08:48.507502 (+   320us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:48.507517 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:48.507635 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:48.507932 (+   297us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:48.507935 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:48.508146 (+   211us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:48.508339 (+   193us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.508344 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.509220 (+   876us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.509224 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:48.509698 (+   474us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.509709 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.509850 (+   141us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.510508 (+   658us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:48.510523 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:48.512519 (+  1996us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":245,"thread_start_us":126,"threads_started":1}
W20260504 14:08:48.512813  4059 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:48.507284 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:41869 (local address 127.25.254.254:35629)
0504 14:08:48.507451 (+   167us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:48.507457 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:48.507630 (+   173us) server_negotiation.cc:408] Connection header received
0504 14:08:48.507737 (+   107us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:48.507740 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:48.507801 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:48.507872 (+    71us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:48.508489 (+   617us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.509074 (+   585us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.509860 (+   786us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.510009 (+   149us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.512647 (+  2638us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:41869: BlockingRecv error: recv got EOF from 127.25.254.194:41869 (error 108)
Metrics: {"server-negotiator.queue_time_us":67}
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:48 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:48.701303  4061 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:48.695990 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:48289)
0504 14:08:48.696264 (+   274us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:48.696277 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:48.696386 (+   109us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:48.696937 (+   551us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:48.696941 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:48.697137 (+   196us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:48.697331 (+   194us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.697336 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.698245 (+   909us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.698252 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:48.698642 (+   390us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:48.698648 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.698748 (+   100us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.699316 (+   568us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:48.699329 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:48.701174 (+  1845us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":193,"thread_start_us":99,"threads_started":1}
W20260504 14:08:48.701524  4059 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:48.696053 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:48289 (local address 127.25.254.254:35629)
0504 14:08:48.696215 (+   162us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:48.696220 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:48.696357 (+   137us) server_negotiation.cc:408] Connection header received
0504 14:08:48.696644 (+   287us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:48.696647 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:48.696761 (+   114us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:48.696844 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:48.697467 (+   623us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.698046 (+   579us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:48.698833 (+   787us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:48.699057 (+   224us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:48.701385 (+  2328us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:48289: BlockingRecv error: recv got EOF from 127.25.254.195:48289 (error 108)
Metrics: {"server-negotiator.queue_time_us":64}
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:49.321952  4062 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:49.316485 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:55355)
0504 14:08:49.316821 (+   336us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:49.316837 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:49.316930 (+    93us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:49.317220 (+   290us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:49.317222 (+     2us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:49.317407 (+   185us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:49.317607 (+   200us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.317612 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.318456 (+   844us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.318459 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:49.318897 (+   438us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.318905 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.319053 (+   148us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.319658 (+   605us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:49.319674 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:49.321780 (+  2106us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":197,"thread_start_us":103,"threads_started":1}
W20260504 14:08:49.322027  4063 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:49.316540 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:55355 (local address 127.25.254.254:35629)
0504 14:08:49.316885 (+   345us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:49.316888 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:49.316902 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:49.317025 (+   123us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:49.317027 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:49.317095 (+    68us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:49.317180 (+    85us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:49.317713 (+   533us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.318315 (+   602us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.319072 (+   757us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.319221 (+   149us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.321898 (+  2677us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:55355: BlockingRecv error: recv got EOF from 127.25.254.193:55355 (error 108)
Metrics: {"server-negotiator.queue_time_us":266,"thread_start_us":187,"threads_started":1}
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:49.519378  4064 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:49.513606 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:48299)
0504 14:08:49.514016 (+   410us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:49.514035 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:49.514281 (+   246us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:49.514582 (+   301us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:49.514586 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:49.514856 (+   270us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:49.515160 (+   304us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.515172 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.516123 (+   951us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.516130 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:49.516608 (+   478us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.516616 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.516717 (+   101us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.517250 (+   533us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:49.517263 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:49.519205 (+  1942us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":289,"thread_start_us":133,"threads_started":1}
W20260504 14:08:49.519419  4063 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:49.513802 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:48299 (local address 127.25.254.254:35629)
0504 14:08:49.513994 (+   192us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:49.514000 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:49.514193 (+   193us) server_negotiation.cc:408] Connection header received
0504 14:08:49.514406 (+   213us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:49.514409 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:49.514464 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:49.514556 (+    92us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:49.515301 (+   745us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.515975 (+   674us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.516719 (+   744us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.516871 (+   152us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.519321 (+  2450us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:48299: BlockingRecv error: recv got EOF from 127.25.254.194:48299 (error 108)
Metrics: {"server-negotiator.queue_time_us":85}
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:49 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:49.707167  4065 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:49.702205 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:52543)
0504 14:08:49.702499 (+   294us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:49.702512 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:49.702617 (+   105us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:49.702900 (+   283us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:49.702903 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:49.703095 (+   192us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:49.703287 (+   192us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.703292 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.703979 (+   687us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.703982 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:49.704358 (+   376us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:49.704364 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.704457 (+    93us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.705036 (+   579us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:49.705052 (+    16us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:49.706952 (+  1900us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":199,"thread_start_us":115,"threads_started":1}
W20260504 14:08:49.707267  4063 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:49.702351 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:52543 (local address 127.25.254.254:35629)
0504 14:08:49.702496 (+   145us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:49.702499 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:49.702595 (+    96us) server_negotiation.cc:408] Connection header received
0504 14:08:49.702704 (+   109us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:49.702706 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:49.702754 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:49.702833 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:49.703397 (+   564us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.703879 (+   482us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:49.704464 (+   585us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:49.704595 (+   131us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:49.707129 (+  2534us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:52543: BlockingRecv error: recv got EOF from 127.25.254.195:52543 (error 108)
Metrics: {"server-negotiator.queue_time_us":57}
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:50.327819  4066 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:50.322803 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:59251)
0504 14:08:50.323108 (+   305us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:50.323122 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:50.323217 (+    95us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:50.323458 (+   241us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:50.323461 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:50.323643 (+   182us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:50.323843 (+   200us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.323848 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.324555 (+   707us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.324558 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:50.324964 (+   406us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.324970 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.325064 (+    94us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.325630 (+   566us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:50.325643 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:50.327643 (+  2000us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":223,"thread_start_us":93,"threads_started":1}
W20260504 14:08:50.327876  4067 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:50.322941 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:59251 (local address 127.25.254.254:35629)
0504 14:08:50.323196 (+   255us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:50.323199 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:50.323237 (+    38us) server_negotiation.cc:408] Connection header received
0504 14:08:50.323290 (+    53us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:50.323292 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:50.323337 (+    45us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:50.323420 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:50.323949 (+   529us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.324459 (+   510us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.325086 (+   627us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.325224 (+   138us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.327761 (+  2537us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:59251: BlockingRecv error: recv got EOF from 127.25.254.193:59251 (error 108)
Metrics: {"server-negotiator.queue_time_us":189,"thread_start_us":62,"threads_started":1}
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:50.525983  4068 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:50.520370 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:45791)
0504 14:08:50.520737 (+   367us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:50.520755 (+    18us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:50.520912 (+   157us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:50.521198 (+   286us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:50.521201 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:50.521414 (+   213us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:50.521599 (+   185us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.521605 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.522544 (+   939us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.522549 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:50.523013 (+   464us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.523020 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.523156 (+   136us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.523693 (+   537us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:50.523706 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:50.525864 (+  2158us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":268,"thread_start_us":125,"threads_started":1}
W20260504 14:08:50.526211  4067 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:50.520577 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:45791 (local address 127.25.254.254:35629)
0504 14:08:50.520738 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:50.520743 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:50.520844 (+   101us) server_negotiation.cc:408] Connection header received
0504 14:08:50.521019 (+   175us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:50.521022 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:50.521080 (+    58us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:50.521173 (+    93us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:50.521733 (+   560us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.522422 (+   689us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.523137 (+   715us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.523344 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.526028 (+  2684us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:45791: BlockingRecv error: recv got EOF from 127.25.254.194:45791 (error 108)
Metrics: {"server-negotiator.queue_time_us":64}
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:50 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:50.713406  4069 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:50.708101 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:52593)
0504 14:08:50.708498 (+   397us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:50.708515 (+    17us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:50.708668 (+   153us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:50.708988 (+   320us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:50.708991 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:50.709219 (+   228us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:50.709403 (+   184us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.709408 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.710218 (+   810us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.710223 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:50.710605 (+   382us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:50.710610 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.710735 (+   125us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.711287 (+   552us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:50.711300 (+    13us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:50.713240 (+  1940us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":321,"thread_start_us":93,"threads_started":1}
W20260504 14:08:50.713464  4067 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:50.708296 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:52593 (local address 127.25.254.254:35629)
0504 14:08:50.708481 (+   185us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:50.708485 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:50.708607 (+   122us) server_negotiation.cc:408] Connection header received
0504 14:08:50.708773 (+   166us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:50.708777 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:50.708864 (+    87us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:50.708964 (+   100us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:50.709529 (+   565us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.710049 (+   520us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:50.710720 (+   671us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:50.710930 (+   210us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:50.713355 (+  2425us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:52593: BlockingRecv error: recv got EOF from 127.25.254.195:52593 (error 108)
Metrics: {"server-negotiator.queue_time_us":82}
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:51.337381  4070 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:51.328800 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:43009)
0504 14:08:51.329149 (+   349us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:51.329164 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:51.329284 (+   120us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:51.329597 (+   313us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:51.329600 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:51.329787 (+   187us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:51.330001 (+   214us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.330007 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.330793 (+   786us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.330796 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:51.331335 (+   539us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.331344 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.331454 (+   110us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.335120 (+  3666us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:51.335141 (+    21us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:51.337209 (+  2068us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":261,"thread_start_us":104,"threads_started":1}
W20260504 14:08:51.337455  4071 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:51.328829 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:43009 (local address 127.25.254.254:35629)
0504 14:08:51.329179 (+   350us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:51.329182 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:51.329356 (+   174us) server_negotiation.cc:408] Connection header received
0504 14:08:51.329405 (+    49us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:51.329408 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:51.329456 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:51.329542 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:51.330124 (+   582us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.330655 (+   531us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.331502 (+   847us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.331664 (+   162us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.337332 (+  5668us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:43009: BlockingRecv error: recv got EOF from 127.25.254.193:43009 (error 108)
Metrics: {"server-negotiator.queue_time_us":282,"thread_start_us":65,"threads_started":1}
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:51.532505  4072 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:51.527123 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:56845)
0504 14:08:51.527429 (+   306us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:51.527442 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:51.527549 (+   107us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:51.527837 (+   288us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:51.527840 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:51.528046 (+   206us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:51.528261 (+   215us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.528267 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.529072 (+   805us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.529076 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:51.529457 (+   381us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.529463 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.529568 (+   105us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.530268 (+   700us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:51.530288 (+    20us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:51.532384 (+  2096us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":208,"thread_start_us":87,"threads_started":1}
W20260504 14:08:51.532734  4071 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:51.527206 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:56845 (local address 127.25.254.254:35629)
0504 14:08:51.527357 (+   151us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:51.527361 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:51.527559 (+   198us) server_negotiation.cc:408] Connection header received
0504 14:08:51.527617 (+    58us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:51.527620 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:51.527675 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:51.527764 (+    89us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:51.528420 (+   656us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.528930 (+   510us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.529575 (+   645us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.529753 (+   178us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.532591 (+  2838us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:56845: BlockingRecv error: recv got EOF from 127.25.254.194:56845 (error 108)
Metrics: {"server-negotiator.queue_time_us":54}
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:51 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.195@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:51.719609  4073 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:51.714336 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.195:43525)
0504 14:08:51.714704 (+   368us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:51.714717 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:51.714867 (+   150us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:51.715152 (+   285us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:51.715155 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:51.715367 (+   212us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:51.715553 (+   186us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.715559 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.716326 (+   767us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.716332 (+     6us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:51.716718 (+   386us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:51.716724 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.716856 (+   132us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.717430 (+   574us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:51.717444 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:51.719433 (+  1989us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":298,"thread_start_us":82,"threads_started":1}
W20260504 14:08:51.719666  4071 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:51.714391 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:43525 (local address 127.25.254.254:35629)
0504 14:08:51.714552 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:51.714556 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:51.714804 (+   248us) server_negotiation.cc:408] Connection header received
0504 14:08:51.714987 (+   183us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:51.714990 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:51.715039 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:51.715127 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:51.715694 (+   567us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.716207 (+   513us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:51.716838 (+   631us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:51.717054 (+   216us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:51.719558 (+  2504us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.195:43525: BlockingRecv error: recv got EOF from 127.25.254.195:43525 (error 108)
Metrics: {"server-negotiator.queue_time_us":53}
May 04 14:08:52 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:52 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.193@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:52.343886  4074 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:52.338446 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.193:41199)
0504 14:08:52.338736 (+   290us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:52.338752 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:52.338858 (+   106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:52.339161 (+   303us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:52.339164 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:52.339376 (+   212us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:52.339568 (+   192us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:52.339573 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.340461 (+   888us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:52.340465 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:52.340858 (+   393us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:52.340865 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.341002 (+   137us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:52.341612 (+   610us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:52.341627 (+    15us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:52.343672 (+  2045us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":194,"thread_start_us":103,"threads_started":1}
W20260504 14:08:52.343923  4075 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:52.338628 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:41199 (local address 127.25.254.254:35629)
0504 14:08:52.338885 (+   257us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:52.338889 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:52.338903 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:52.338947 (+    44us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:52.338950 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:52.338998 (+    48us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:52.339111 (+   113us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:52.339708 (+   597us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.340323 (+   615us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:52.340985 (+   662us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.341209 (+   224us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:52.343805 (+  2596us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.193:41199: BlockingRecv error: recv got EOF from 127.25.254.193:41199 (error 108)
Metrics: {"server-negotiator.queue_time_us":179,"thread_start_us":92,"threads_started":1}
May 04 14:08:52 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
May 04 14:08:52 dist-test-slave-2x32 krb5kdc[3008](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: LOOKING_UP_SERVER: authtime 0,  oryx/127.25.254.194@KRBTEST.COM for oryx/127.25.254.254@KRBTEST.COM, Server not found in Kerberos database
I20260504 14:08:52.539565  4076 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:52.533621 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:35629 (local address 127.25.254.194:56007)
0504 14:08:52.534010 (+   389us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:52.534024 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:52.534284 (+   260us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:52.534700 (+   416us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:52.534704 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:52.534931 (+   227us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:52.535207 (+   276us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:52.535219 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.535978 (+   759us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:52.535981 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:52.536384 (+   403us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:52.536391 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.536486 (+    95us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:52.537273 (+   787us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:52.537287 (+    14us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:52.539441 (+  2154us) negotiation.cc:326] Negotiation complete: Not authorized: Client connection negotiation failed: client connection to 127.25.254.254:35629: Server oryx/127.25.254.254@KRBTEST.COM not found in Kerberos database
Metrics: {"client-negotiator.queue_time_us":263,"thread_start_us":155,"threads_started":1}
W20260504 14:08:52.539898  4075 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:52.533755 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:56007 (local address 127.25.254.254:35629)
0504 14:08:52.533972 (+   217us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:52.533976 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:52.534143 (+   167us) server_negotiation.cc:408] Connection header received
0504 14:08:52.534428 (+   285us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:52.534431 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:52.534494 (+    63us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:52.534592 (+    98us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:52.535347 (+   755us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.535863 (+   516us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:52.536515 (+   652us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.536663 (+   148us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:52.539772 (+  3109us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.25.254.194:56007: BlockingRecv error: recv got EOF from 127.25.254.194:56007 (error 108)
Metrics: {"server-negotiator.queue_time_us":114}
I20260504 14:08:52.606120 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3600
I20260504 14:08:52.612594 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3752
I20260504 14:08:52.618589 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3884
I20260504 14:08:52.624598 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 3530
2026-05-04T14:08:52Z chronyd exiting
[       OK ] SecurityITest.TestMismatchingPrincipals (19091 ms)
[ RUN      ] SecurityITest.TestRequireAuthenticationInsecureCluster
2026-05-04T14:08:52Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:52Z Disabled control of system clock
I20260504 14:08:52.655911 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44643
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:46869
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:44643
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {}
W20260504 14:08:52.764667  4080 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:52.764925  4080 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:52.764968  4080 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:52.768493  4080 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:52.768579  4080 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:52.768595  4080 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:52.768616  4080 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:52.768635  4080 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:52.772791  4080 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46869
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:44643
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:44643
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4080
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:52.773911  4080 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:52.775099  4080 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:52.781260  4088 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:52.781335  4086 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:52.781518  4080 server_base.cc:1061] running on GCE node
W20260504 14:08:52.781260  4085 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:52.782279  4080 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:52.783253  4080 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:52.784418  4080 hybrid_clock.cc:648] HybridClock initialized: now 1777903732784396 us; error 39 us; skew 500 ppm
I20260504 14:08:52.786545  4080 webserver.cc:492] Webserver started at http://127.25.254.254:34459/ using document root <none> and password file <none>
I20260504 14:08:52.787169  4080 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:52.787226  4080 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:52.787451  4080 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:52.789214  4080 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "e991725f7de740e89001e42c86041b7b"
format_stamp: "Formatted at 2026-05-04 14:08:52 on dist-test-slave-2x32"
server_key: "40a200b80dfc71b3c2da3bebc617dda0"
server_key_iv: "53b55a815f524c830cfad8c41b8301c0"
server_key_version: "encryptionkey@0"
I20260504 14:08:52.789752  4080 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "e991725f7de740e89001e42c86041b7b"
format_stamp: "Formatted at 2026-05-04 14:08:52 on dist-test-slave-2x32"
server_key: "40a200b80dfc71b3c2da3bebc617dda0"
server_key_iv: "53b55a815f524c830cfad8c41b8301c0"
server_key_version: "encryptionkey@0"
I20260504 14:08:52.793274  4080 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:08:52.795691  4094 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:52.796868  4080 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:52.797017  4080 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "e991725f7de740e89001e42c86041b7b"
format_stamp: "Formatted at 2026-05-04 14:08:52 on dist-test-slave-2x32"
server_key: "40a200b80dfc71b3c2da3bebc617dda0"
server_key_iv: "53b55a815f524c830cfad8c41b8301c0"
server_key_version: "encryptionkey@0"
I20260504 14:08:52.797135  4080 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:52.817071  4080 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:52.817811  4080 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:52.818017  4080 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:52.826083  4080 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:44643
I20260504 14:08:52.826108  4146 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:44643 every 8 connection(s)
I20260504 14:08:52.827163  4080 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:52.830063  4147 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:52.832314 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4080
I20260504 14:08:52.832470 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:52.832734 26619 external_mini_cluster.cc:1468] Setting key 6a882a9227d65b99e8f011c1ec3df78a
I20260504 14:08:52.836228  4147 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b: Bootstrap starting.
I20260504 14:08:52.838515  4147 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:52.839394  4147 log.cc:826] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:52.840744  4150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:52.834084 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53122 (local address 127.25.254.254:44643)
0504 14:08:52.834757 (+   673us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:52.834765 (+     8us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:52.834801 (+    36us) server_negotiation.cc:408] Connection header received
0504 14:08:52.835455 (+   654us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:52.835474 (+    19us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:52.835891 (+   417us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:52.836242 (+   351us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:52.836761 (+   519us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.837528 (+   767us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:52.838307 (+   779us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:52.838620 (+   313us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:52.839185 (+   565us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:52.839214 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:52.839228 (+    14us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:52.839642 (+   414us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:52.839678 (+    36us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:52.839691 (+    13us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:52.839811 (+   120us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:52.839996 (+   185us) server_negotiation.cc:300] Negotiation successful
0504 14:08:52.840161 (+   165us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":357,"thread_start_us":162,"threads_started":1}
I20260504 14:08:52.841444  4147 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b: No bootstrap required, opened a new log
I20260504 14:08:52.844259  4147 raft_consensus.cc:359] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } }
I20260504 14:08:52.844482  4147 raft_consensus.cc:385] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:52.844553  4147 raft_consensus.cc:740] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e991725f7de740e89001e42c86041b7b, State: Initialized, Role: FOLLOWER
I20260504 14:08:52.844969  4147 consensus_queue.cc:260] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } }
I20260504 14:08:52.845091  4147 raft_consensus.cc:399] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:52.845170  4147 raft_consensus.cc:493] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:52.845257  4147 raft_consensus.cc:3060] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:52.846131  4147 raft_consensus.cc:515] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } }
I20260504 14:08:52.846513  4147 leader_election.cc:304] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: e991725f7de740e89001e42c86041b7b; no voters: 
I20260504 14:08:52.846849  4147 leader_election.cc:290] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:52.847030  4152 raft_consensus.cc:2804] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:52.847289  4152 raft_consensus.cc:697] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [term 1 LEADER]: Becoming Leader. State: Replica: e991725f7de740e89001e42c86041b7b, State: Running, Role: LEADER
I20260504 14:08:52.847581  4152 consensus_queue.cc:237] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } }
I20260504 14:08:52.848099  4147 sys_catalog.cc:565] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:52.849181  4154 sys_catalog.cc:455] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [sys.catalog]: SysCatalogTable state changed. Reason: New leader e991725f7de740e89001e42c86041b7b. Latest consensus state: current_term: 1 leader_uuid: "e991725f7de740e89001e42c86041b7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } } }
I20260504 14:08:52.849321  4154 sys_catalog.cc:458] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:52.849725  4161 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:52.849910  4153 sys_catalog.cc:455] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "e991725f7de740e89001e42c86041b7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e991725f7de740e89001e42c86041b7b" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 44643 } } }
I20260504 14:08:52.850003  4153 sys_catalog.cc:458] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:52.853251  4161 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:52.859524  4161 catalog_manager.cc:1357] Generated new cluster ID: a743c7d61dcd493fbe4c6861b69cfb04
I20260504 14:08:52.859632  4161 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:52.876935  4161 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:52.877942  4161 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:52.894230  4161 catalog_manager.cc:6044] T 00000000000000000000000000000000 P e991725f7de740e89001e42c86041b7b: Generated new TSK 0
I20260504 14:08:52.895061  4161 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:08:52.908793 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--builtin_ntp_servers=127.25.254.212:46869
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {}
W20260504 14:08:53.028789  4171 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:53.029201  4171 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:53.029348  4171 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:53.032979  4171 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:53.033097  4171 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:53.033209  4171 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:53.037590  4171 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46869
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4171
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:53.038815  4171 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:53.040081  4171 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:53.047220  4179 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.047204  4176 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.047186  4177 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.047565  4171 server_base.cc:1061] running on GCE node
I20260504 14:08:53.048055  4171 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:53.048691  4171 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:53.049881  4171 hybrid_clock.cc:648] HybridClock initialized: now 1777903733049852 us; error 52 us; skew 500 ppm
I20260504 14:08:53.052228  4171 webserver.cc:492] Webserver started at http://127.25.254.193:42775/ using document root <none> and password file <none>
I20260504 14:08:53.052850  4171 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:53.052910  4171 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:53.053146  4171 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:53.055011  4171 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "4c9c5bdf89df472f850763590bf6d233"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "4d2260ce3c80a71d6528ca9ff9618ffb"
server_key_iv: "e99d5f47d5d456d559987ce97a7423e9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.055511  4171 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "4c9c5bdf89df472f850763590bf6d233"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "4d2260ce3c80a71d6528ca9ff9618ffb"
server_key_iv: "e99d5f47d5d456d559987ce97a7423e9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.059140  4171 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.004s	sys 0.000s
I20260504 14:08:53.061867  4185 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.063093  4171 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:53.063252  4171 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "4c9c5bdf89df472f850763590bf6d233"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "4d2260ce3c80a71d6528ca9ff9618ffb"
server_key_iv: "e99d5f47d5d456d559987ce97a7423e9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.063375  4171 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:53.086575  4171 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:53.087412  4171 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:53.087653  4171 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:53.088308  4171 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:53.089437  4171 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:53.089525  4171 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.089602  4171 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:53.089655  4171 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.099628  4171 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:38931
I20260504 14:08:53.099650  4298 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:38931 every 8 connection(s)
I20260504 14:08:53.100684  4171 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:53.106503 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4171
I20260504 14:08:53.106648 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:53.106933 26619 external_mini_cluster.cc:1468] Setting key 67084ae416aa8d374f02e0b5d34ba5d1
I20260504 14:08:53.109295 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--builtin_ntp_servers=127.25.254.212:46869
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {}
I20260504 14:08:53.109479  4150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.102625 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:49679 (local address 127.25.254.254:44643)
0504 14:08:53.102856 (+   231us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.102860 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.103714 (+   854us) server_negotiation.cc:408] Connection header received
0504 14:08:53.104565 (+   851us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.104568 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.104627 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.104710 (+    83us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.105639 (+   929us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.106372 (+   733us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.107278 (+   906us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.107499 (+   221us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.108905 (+  1406us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:53.108921 (+    16us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:53.108923 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:53.108936 (+    13us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:53.108955 (+    19us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:53.108962 (+     7us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:53.109034 (+    72us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:53.109255 (+   221us) server_negotiation.cc:300] Negotiation successful
0504 14:08:53.109306 (+    51us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":91}
I20260504 14:08:53.110329  4301 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.102930 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44643 (local address 127.25.254.193:49679)
0504 14:08:53.103553 (+   623us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:53.103585 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:53.104346 (+   761us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:53.104869 (+   523us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:53.104878 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:53.104937 (+    59us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:53.105439 (+   502us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.105450 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.106520 (+  1070us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.106527 (+     7us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:53.107134 (+   607us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.107145 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.107374 (+   229us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.108652 (+  1278us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:53.108654 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:53.108687 (+    33us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:53.108750 (+    63us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:53.109069 (+   319us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:53.109080 (+    11us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:53.109089 (+     9us) client_negotiation.cc:770] Sending connection context
0504 14:08:53.109259 (+   170us) client_negotiation.cc:241] Negotiation successful
0504 14:08:53.109430 (+   171us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":277,"thread_start_us":99,"threads_started":1}
I20260504 14:08:53.111603  4299 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44643
I20260504 14:08:53.111869  4299 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:53.112445  4299 heartbeater.cc:507] Master 127.25.254.254:44643 requested a full tablet report, sending...
I20260504 14:08:53.114236  4111 ts_manager.cc:194] Registered new tserver with Master: 4c9c5bdf89df472f850763590bf6d233 (127.25.254.193:38931)
I20260504 14:08:53.116044  4111 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.25.254.193:49679
W20260504 14:08:53.216295  4302 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:53.216524  4302 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:53.216572  4302 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:53.220041  4302 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:53.220113  4302 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:53.220186  4302 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:53.224432  4302 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46869
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4302
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:53.225642  4302 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:53.226874  4302 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:53.233944  4310 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.234009  4308 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.234061  4307 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.234244  4302 server_base.cc:1061] running on GCE node
I20260504 14:08:53.234752  4302 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:53.235440  4302 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:53.236645  4302 hybrid_clock.cc:648] HybridClock initialized: now 1777903733236630 us; error 28 us; skew 500 ppm
I20260504 14:08:53.238770  4302 webserver.cc:492] Webserver started at http://127.25.254.194:46573/ using document root <none> and password file <none>
I20260504 14:08:53.239426  4302 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:53.239523  4302 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:53.239782  4302 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:53.241546  4302 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "53d7cb23f63c435ebff09fcb4edeb962"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "d7b529c4a4038f8a9128f848d624918d"
server_key_iv: "aed202c03b7a9a042dd9c9e6b1100a9c"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.242079  4302 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "53d7cb23f63c435ebff09fcb4edeb962"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "d7b529c4a4038f8a9128f848d624918d"
server_key_iv: "aed202c03b7a9a042dd9c9e6b1100a9c"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.245740  4302 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:53.248286  4316 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.249570  4302 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.002s
I20260504 14:08:53.249711  4302 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "53d7cb23f63c435ebff09fcb4edeb962"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "d7b529c4a4038f8a9128f848d624918d"
server_key_iv: "aed202c03b7a9a042dd9c9e6b1100a9c"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.249836  4302 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:53.266242  4302 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:53.266933  4302 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:53.267138  4302 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:53.267812  4302 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:53.268927  4302 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:53.269006  4302 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.269078  4302 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:53.269135  4302 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.280943  4302 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:39713
I20260504 14:08:53.280963  4429 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:39713 every 8 connection(s)
I20260504 14:08:53.281956  4302 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:53.286196 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4302
I20260504 14:08:53.286339 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:53.286643 26619 external_mini_cluster.cc:1468] Setting key fd9f03ee8e29a5a0bb02d262fc0ebba7
I20260504 14:08:53.289222 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--builtin_ntp_servers=127.25.254.212:46869
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {}
I20260504 14:08:53.291488  4150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.283840 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:43537 (local address 127.25.254.254:44643)
0504 14:08:53.284056 (+   216us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.284061 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.284976 (+   915us) server_negotiation.cc:408] Connection header received
0504 14:08:53.285877 (+   901us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.285881 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.285938 (+    57us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.286030 (+    92us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.287370 (+  1340us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.288058 (+   688us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.288982 (+   924us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.289158 (+   176us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.290834 (+  1676us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:53.290854 (+    20us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:53.290856 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:53.290868 (+    12us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:53.290891 (+    23us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:53.290898 (+     7us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:53.291063 (+   165us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:53.291269 (+   206us) server_negotiation.cc:300] Negotiation successful
0504 14:08:53.291339 (+    70us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":72}
I20260504 14:08:53.292376  4432 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.284186 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44643 (local address 127.25.254.194:43537)
0504 14:08:53.284816 (+   630us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:53.284850 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:53.285605 (+   755us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:53.286281 (+   676us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:53.286295 (+    14us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:53.286388 (+    93us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:53.287146 (+   758us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.287161 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.288300 (+  1139us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.288304 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:53.288848 (+   544us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.288858 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.289147 (+   289us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.290545 (+  1398us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:53.290549 (+     4us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:53.290589 (+    40us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:53.290661 (+    72us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:53.291085 (+   424us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:53.291100 (+    15us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:53.291105 (+     5us) client_negotiation.cc:770] Sending connection context
0504 14:08:53.291263 (+   158us) client_negotiation.cc:241] Negotiation successful
0504 14:08:53.291430 (+   167us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":281,"thread_start_us":105,"threads_started":1}
I20260504 14:08:53.293794  4430 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44643
I20260504 14:08:53.294121  4430 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:53.294732  4430 heartbeater.cc:507] Master 127.25.254.254:44643 requested a full tablet report, sending...
I20260504 14:08:53.295960  4111 ts_manager.cc:194] Registered new tserver with Master: 53d7cb23f63c435ebff09fcb4edeb962 (127.25.254.194:39713)
I20260504 14:08:53.296823  4111 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.25.254.194:43537
W20260504 14:08:53.396109  4433 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:53.396377  4433 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:53.396469  4433 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:53.400986  4433 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:53.401062  4433 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:53.401172  4433 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:53.405306  4433 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:46869
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:44643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4433
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:53.406490  4433 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:53.407610  4433 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:53.415132  4438 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.415139  4433 server_base.cc:1061] running on GCE node
W20260504 14:08:53.415129  4439 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.415102  4441 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.415750  4433 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:53.416317  4433 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:53.417479  4433 hybrid_clock.cc:648] HybridClock initialized: now 1777903733417455 us; error 38 us; skew 500 ppm
I20260504 14:08:53.419427  4433 webserver.cc:492] Webserver started at http://127.25.254.195:44657/ using document root <none> and password file <none>
I20260504 14:08:53.420048  4433 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:53.420109  4433 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:53.420367  4433 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:53.422082  4433 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "e399324699504b81ae9020553da30589"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "541d1707692521891d0f17e5bc8c020e"
server_key_iv: "f35f23a6490840208671fa6475a0c9a9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.422657  4433 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "e399324699504b81ae9020553da30589"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "541d1707692521891d0f17e5bc8c020e"
server_key_iv: "f35f23a6490840208671fa6475a0c9a9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.426331  4433 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.001s
I20260504 14:08:53.428764  4447 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.429811  4433 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:53.429939  4433 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "e399324699504b81ae9020553da30589"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "541d1707692521891d0f17e5bc8c020e"
server_key_iv: "f35f23a6490840208671fa6475a0c9a9"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.430047  4433 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:53.470417  4433 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:53.471127  4433 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:53.471329  4433 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:53.471925  4433 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:53.472954  4433 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:53.473028  4433 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.473094  4433 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:53.473136  4433 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.482406  4433 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:36693
I20260504 14:08:53.482427  4560 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:36693 every 8 connection(s)
I20260504 14:08:53.483429  4433 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:53.486852 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4433
I20260504 14:08:53.486991 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:53.487267 26619 external_mini_cluster.cc:1468] Setting key 7e373d2d430f0ba337253dcf96a62824
I20260504 14:08:53.492596  4150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.485323 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:37169 (local address 127.25.254.254:44643)
0504 14:08:53.485485 (+   162us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.485489 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.486493 (+  1004us) server_negotiation.cc:408] Connection header received
0504 14:08:53.487746 (+  1253us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.487751 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.487837 (+    86us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.487949 (+   112us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.489152 (+  1203us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.489721 (+   569us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.490455 (+   734us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.490642 (+   187us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.492029 (+  1387us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:53.492063 (+    34us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:53.492066 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:53.492079 (+    13us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:53.492100 (+    21us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:53.492108 (+     8us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:53.492191 (+    83us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:53.492382 (+   191us) server_negotiation.cc:300] Negotiation successful
0504 14:08:53.492444 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":58}
I20260504 14:08:53.493299  4563 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.485607 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:44643 (local address 127.25.254.195:37169)
0504 14:08:53.486348 (+   741us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:53.486382 (+    34us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:53.487488 (+  1106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:53.488251 (+   763us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:53.488262 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:53.488378 (+   116us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:53.488991 (+   613us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.489006 (+    15us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.489867 (+   861us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:53.489871 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:53.490333 (+   462us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:53.490340 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:53.490585 (+   245us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:53.491738 (+  1153us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:53.491739 (+     1us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:53.491768 (+    29us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:53.491875 (+   107us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:53.492209 (+   334us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:53.492219 (+    10us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:53.492229 (+    10us) client_negotiation.cc:770] Sending connection context
0504 14:08:53.492378 (+   149us) client_negotiation.cc:241] Negotiation successful
0504 14:08:53.492541 (+   163us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":297,"thread_start_us":109,"threads_started":1}
I20260504 14:08:53.494488  4561 heartbeater.cc:344] Connected to a master server at 127.25.254.254:44643
I20260504 14:08:53.494839  4561 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:53.495489  4561 heartbeater.cc:507] Master 127.25.254.254:44643 requested a full tablet report, sending...
I20260504 14:08:53.496644  4111 ts_manager.cc:194] Registered new tserver with Master: e399324699504b81ae9020553da30589 (127.25.254.195:36693)
I20260504 14:08:53.497272  4111 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.25.254.195:37169
I20260504 14:08:53.501906 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
W20260504 14:08:53.505074  4150 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:53.504005 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:53130 (local address 127.25.254.254:44643)
0504 14:08:53.504183 (+   178us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.504188 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.504314 (+   126us) server_negotiation.cc:408] Connection header received
0504 14:08:53.504417 (+   103us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.504419 (+     2us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.504466 (+    47us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.504538 (+    72us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.504957 (+   419us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:53130: BlockingRecv error: recv got EOF from 127.0.0.1:53130 (error 108)
Metrics: {"server-negotiator.queue_time_us":81}
I20260504 14:08:53.505246 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4171
I20260504 14:08:53.511602 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4302
I20260504 14:08:53.517799 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4433
I20260504 14:08:53.524215 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4080
2026-05-04T14:08:53Z chronyd exiting
[       OK ] SecurityITest.TestRequireAuthenticationInsecureCluster (898 ms)
[ RUN      ] SecurityITest.TestRequireEncryptionInsecureCluster
2026-05-04T14:08:53Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:53Z Disabled control of system clock
I20260504 14:08:53.554339 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:45643
--webserver_interface=127.25.254.254
--webserver_port=0
--builtin_ntp_servers=127.25.254.212:35885
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:45643
--encrypt_data_at_rest=true
--rpc_trace_negotiation
--rpc_encryption=disabled
--rpc_authentication=disabled with env {}
W20260504 14:08:53.665853  4572 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:53.666115  4572 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:53.666246  4572 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:53.669941  4572 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:53.670018  4572 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:53.670035  4572 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:53.670056  4572 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:53.670083  4572 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:53.674585  4572 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:35885
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:45643
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:45643
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4572
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:53.675710  4572 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:53.676890  4572 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:53.683944  4578 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.683930  4580 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.683934  4577 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.684350  4572 server_base.cc:1061] running on GCE node
I20260504 14:08:53.685038  4572 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:53.686050  4572 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:53.687261  4572 hybrid_clock.cc:648] HybridClock initialized: now 1777903733687226 us; error 52 us; skew 500 ppm
I20260504 14:08:53.689538  4572 webserver.cc:492] Webserver started at http://127.25.254.254:36165/ using document root <none> and password file <none>
I20260504 14:08:53.690337  4572 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:53.690423  4572 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:53.690637  4572 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:53.692410  4572 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "e9a073a061f34d7bbf76a21511a1ad76"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "6a5449308d4bf8062ed64c7e365bbed0"
server_key_iv: "270ba5936d0449599bbe0dfbf445740e"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.692950  4572 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "e9a073a061f34d7bbf76a21511a1ad76"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "6a5449308d4bf8062ed64c7e365bbed0"
server_key_iv: "270ba5936d0449599bbe0dfbf445740e"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.696605  4572 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.002s
I20260504 14:08:53.699259  4586 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.700304  4572 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.000s
I20260504 14:08:53.700451  4572 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "e9a073a061f34d7bbf76a21511a1ad76"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "6a5449308d4bf8062ed64c7e365bbed0"
server_key_iv: "270ba5936d0449599bbe0dfbf445740e"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.700580  4572 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:53.705574  4572 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:53.706290  4572 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:53.706669  4572 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:53.714619  4638 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:45643 every 8 connection(s)
I20260504 14:08:53.714596  4572 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:45643
I20260504 14:08:53.716032  4572 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:53.719165  4639 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:53.720292 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4572
I20260504 14:08:53.720445 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:53.720834 26619 external_mini_cluster.cc:1468] Setting key 407e631aa761d22c04fc66541c7194fa
I20260504 14:08:53.726104  4639 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76: Bootstrap starting.
I20260504 14:08:53.728370  4642 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.722376 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50258 (local address 127.25.254.254:45643)
0504 14:08:53.723799 (+  1423us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.723821 (+    22us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.723865 (+    44us) server_negotiation.cc:408] Connection header received
0504 14:08:53.724734 (+   869us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.724767 (+    33us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.725211 (+   444us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.725599 (+   388us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.727020 (+  1421us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:53.727025 (+     5us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:53.727037 (+    12us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:53.727048 (+    11us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:53.727087 (+    39us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:53.727097 (+    10us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:53.727242 (+   145us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:53.727589 (+   347us) server_negotiation.cc:300] Negotiation successful
0504 14:08:53.727737 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":1036,"thread_start_us":699,"threads_started":1}
I20260504 14:08:53.729077  4639 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:53.729851  4639 log.cc:826] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:53.731868  4639 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76: No bootstrap required, opened a new log
I20260504 14:08:53.734726  4639 raft_consensus.cc:359] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } }
I20260504 14:08:53.734946  4639 raft_consensus.cc:385] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:53.735018  4639 raft_consensus.cc:740] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9a073a061f34d7bbf76a21511a1ad76, State: Initialized, Role: FOLLOWER
I20260504 14:08:53.735468  4639 consensus_queue.cc:260] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } }
I20260504 14:08:53.735594  4639 raft_consensus.cc:399] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:53.735663  4639 raft_consensus.cc:493] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:53.735765  4639 raft_consensus.cc:3060] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:53.736701  4639 raft_consensus.cc:515] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } }
I20260504 14:08:53.737183  4639 leader_election.cc:304] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: e9a073a061f34d7bbf76a21511a1ad76; no voters: 
I20260504 14:08:53.737555  4639 leader_election.cc:290] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:53.737655  4644 raft_consensus.cc:2804] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:53.737851  4644 raft_consensus.cc:697] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [term 1 LEADER]: Becoming Leader. State: Replica: e9a073a061f34d7bbf76a21511a1ad76, State: Running, Role: LEADER
I20260504 14:08:53.738432  4644 consensus_queue.cc:237] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } }
I20260504 14:08:53.739215  4639 sys_catalog.cc:565] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:53.740815  4645 sys_catalog.cc:455] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "e9a073a061f34d7bbf76a21511a1ad76" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } } }
I20260504 14:08:53.740967  4645 sys_catalog.cc:458] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:53.741026  4646 sys_catalog.cc:455] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [sys.catalog]: SysCatalogTable state changed. Reason: New leader e9a073a061f34d7bbf76a21511a1ad76. Latest consensus state: current_term: 1 leader_uuid: "e9a073a061f34d7bbf76a21511a1ad76" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "e9a073a061f34d7bbf76a21511a1ad76" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 45643 } } }
I20260504 14:08:53.741138  4646 sys_catalog.cc:458] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:53.741907  4653 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:53.744728  4653 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:53.751262  4653 catalog_manager.cc:1357] Generated new cluster ID: 53e4e781419b43b0b8cc804a6507d074
I20260504 14:08:53.751362  4653 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:53.762197  4653 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:53.762573  4653 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:53.774773  4653 catalog_manager.cc:6044] T 00000000000000000000000000000000 P e9a073a061f34d7bbf76a21511a1ad76: Generated new TSK 0
I20260504 14:08:53.775626  4653 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260504 14:08:53.789580 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--builtin_ntp_servers=127.25.254.212:35885
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_encryption=disabled
--rpc_authentication=disabled with env {}
W20260504 14:08:53.903779  4663 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:53.904063  4663 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:53.904174  4663 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:53.908183  4663 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:53.908303  4663 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:53.908423  4663 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:53.912593  4663 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:35885
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4663
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:53.913728  4663 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:53.914932  4663 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:53.921772  4669 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.921913  4663 server_base.cc:1061] running on GCE node
W20260504 14:08:53.921777  4671 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:53.921772  4668 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:53.922484  4663 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:53.923151  4663 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:53.924355  4663 hybrid_clock.cc:648] HybridClock initialized: now 1777903733924324 us; error 52 us; skew 500 ppm
I20260504 14:08:53.926586  4663 webserver.cc:492] Webserver started at http://127.25.254.193:32885/ using document root <none> and password file <none>
I20260504 14:08:53.927235  4663 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:53.927295  4663 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:53.927508  4663 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:53.929363  4663 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "29389320034a4b29a08c53ee62e67f2b"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "371c2517d8705fdb62c68eae97a39b05"
server_key_iv: "a3926247373d03a35b7c0c756970fecd"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.929886  4663 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "29389320034a4b29a08c53ee62e67f2b"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "371c2517d8705fdb62c68eae97a39b05"
server_key_iv: "a3926247373d03a35b7c0c756970fecd"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.933385  4663 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:08:53.936040  4677 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.937239  4663 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:53.937395  4663 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "29389320034a4b29a08c53ee62e67f2b"
format_stamp: "Formatted at 2026-05-04 14:08:53 on dist-test-slave-2x32"
server_key: "371c2517d8705fdb62c68eae97a39b05"
server_key_iv: "a3926247373d03a35b7c0c756970fecd"
server_key_version: "encryptionkey@0"
I20260504 14:08:53.937515  4663 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:53.942754  4663 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:53.943408  4663 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:53.943612  4663 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:53.944177  4663 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:53.945200  4663 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:53.945274  4663 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.945346  4663 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:53.945397  4663 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:53.957479  4663 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:36471
I20260504 14:08:53.957489  4790 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:36471 every 8 connection(s)
I20260504 14:08:53.958657  4663 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:53.964346  4642 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.960625 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:59801 (local address 127.25.254.254:45643)
0504 14:08:53.960808 (+   183us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:53.960812 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:53.961676 (+   864us) server_negotiation.cc:408] Connection header received
0504 14:08:53.962626 (+   950us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:53.962630 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:53.962681 (+    51us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:53.962771 (+    90us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:53.963840 (+  1069us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:53.963845 (+     5us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:53.963848 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:53.963861 (+    13us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:53.963887 (+    26us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:53.963897 (+    10us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:53.963996 (+    99us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:53.964164 (+   168us) server_negotiation.cc:300] Negotiation successful
0504 14:08:53.964202 (+    38us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":73}
I20260504 14:08:53.964972  4793 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:53.960893 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:45643 (local address 127.25.254.193:59801)
0504 14:08:53.961517 (+   624us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:53.961549 (+    32us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:53.962421 (+   872us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:53.962942 (+   521us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:53.962950 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:53.963017 (+    67us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:53.963609 (+   592us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:53.963611 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:53.963637 (+    26us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:53.963688 (+    51us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:53.963984 (+   296us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:53.963991 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:53.963997 (+     6us) client_negotiation.cc:770] Sending connection context
0504 14:08:53.964200 (+   203us) client_negotiation.cc:241] Negotiation successful
0504 14:08:53.964307 (+   107us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":284,"thread_start_us":120,"threads_started":1}
I20260504 14:08:53.966317  4791 heartbeater.cc:344] Connected to a master server at 127.25.254.254:45643
I20260504 14:08:53.966560 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4663
I20260504 14:08:53.966667 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:53.966727  4791 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:53.966958 26619 external_mini_cluster.cc:1468] Setting key 1d360f3df25a75f148eca484bd89b12f
I20260504 14:08:53.967463  4791 heartbeater.cc:507] Master 127.25.254.254:45643 requested a full tablet report, sending...
I20260504 14:08:53.969161 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--builtin_ntp_servers=127.25.254.212:35885
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_encryption=disabled
--rpc_authentication=disabled with env {}
I20260504 14:08:53.969247  4602 ts_manager.cc:194] Registered new tserver with Master: 29389320034a4b29a08c53ee62e67f2b (127.25.254.193:36471)
W20260504 14:08:54.076613  4794 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:54.076946  4794 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:54.077034  4794 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:54.080708  4794 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:54.080826  4794 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:54.080940  4794 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:54.085144  4794 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:35885
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4794
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:54.086393  4794 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:54.087600  4794 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:54.094731  4800 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:54.094877  4802 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:54.094813  4799 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:54.095055  4794 server_base.cc:1061] running on GCE node
I20260504 14:08:54.095597  4794 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:54.096254  4794 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:54.097457  4794 hybrid_clock.cc:648] HybridClock initialized: now 1777903734097430 us; error 40 us; skew 500 ppm
I20260504 14:08:54.099817  4794 webserver.cc:492] Webserver started at http://127.25.254.194:37133/ using document root <none> and password file <none>
I20260504 14:08:54.100455  4794 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:54.100519  4794 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:54.100679  4794 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:54.102489  4794 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "02265749f10043259c2438995604749e"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "09d06c3046403944bb3643a66680b89d"
server_key_iv: "6375bb980cb57ffec508ef4bee2f2ccb"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.102942  4794 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "02265749f10043259c2438995604749e"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "09d06c3046403944bb3643a66680b89d"
server_key_iv: "6375bb980cb57ffec508ef4bee2f2ccb"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.106366  4794 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.001s
I20260504 14:08:54.108764  4808 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.109820  4794 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:54.109923  4794 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "02265749f10043259c2438995604749e"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "09d06c3046403944bb3643a66680b89d"
server_key_iv: "6375bb980cb57ffec508ef4bee2f2ccb"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.110037  4794 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:54.114746  4794 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:54.115298  4794 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:54.115438  4794 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:54.115918  4794 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:54.116848  4794 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:54.116894  4794 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.116933  4794 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:54.116948  4794 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.126561  4794 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:38383
I20260504 14:08:54.126593  4921 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:38383 every 8 connection(s)
I20260504 14:08:54.127610  4794 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:54.133293  4642 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:54.129538 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:54083 (local address 127.25.254.254:45643)
0504 14:08:54.129705 (+   167us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:54.129708 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:54.130711 (+  1003us) server_negotiation.cc:408] Connection header received
0504 14:08:54.131569 (+   858us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:54.131573 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:54.131625 (+    52us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:54.131735 (+   110us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:54.132782 (+  1047us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:54.132786 (+     4us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:54.132789 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:54.132801 (+    12us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:54.132838 (+    37us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:54.132846 (+     8us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:54.132943 (+    97us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:54.133110 (+   167us) server_negotiation.cc:300] Negotiation successful
0504 14:08:54.133147 (+    37us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":70}
I20260504 14:08:54.133877  4924 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:54.129823 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:45643 (local address 127.25.254.194:54083)
0504 14:08:54.130572 (+   749us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:54.130605 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:54.131366 (+   761us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:54.131909 (+   543us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:54.131917 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:54.131980 (+    63us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:54.132559 (+   579us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:54.132561 (+     2us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:54.132586 (+    25us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:54.132632 (+    46us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:54.132938 (+   306us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:54.132945 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:54.132953 (+     8us) client_negotiation.cc:770] Sending connection context
0504 14:08:54.133159 (+   206us) client_negotiation.cc:241] Negotiation successful
0504 14:08:54.133271 (+   112us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":257,"thread_start_us":103,"threads_started":1}
I20260504 14:08:54.135107  4922 heartbeater.cc:344] Connected to a master server at 127.25.254.254:45643
I20260504 14:08:54.135361  4922 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:54.135766 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4794
I20260504 14:08:54.135800  4922 heartbeater.cc:507] Master 127.25.254.254:45643 requested a full tablet report, sending...
I20260504 14:08:54.135901 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:54.136176 26619 external_mini_cluster.cc:1468] Setting key 23fa461a6c6a136e911c698c4caa92b7
I20260504 14:08:54.136956  4602 ts_manager.cc:194] Registered new tserver with Master: 02265749f10043259c2438995604749e (127.25.254.194:38383)
I20260504 14:08:54.138384 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--builtin_ntp_servers=127.25.254.212:35885
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation
--rpc_encryption=disabled
--rpc_authentication=disabled with env {}
W20260504 14:08:54.242470  4925 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:54.242734  4925 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:54.242817  4925 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:54.246361  4925 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:54.246448  4925 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:54.246559  4925 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:54.250784  4925 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:35885
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=disabled
--rpc_encryption=disabled
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:45643
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.4925
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:54.251924  4925 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:54.253041  4925 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:54.259894  4933 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:54.259899  4930 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:54.259992  4931 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:54.260310  4925 server_base.cc:1061] running on GCE node
I20260504 14:08:54.260720  4925 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:54.261456  4925 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:54.262642  4925 hybrid_clock.cc:648] HybridClock initialized: now 1777903734262601 us; error 54 us; skew 500 ppm
I20260504 14:08:54.264721  4925 webserver.cc:492] Webserver started at http://127.25.254.195:46599/ using document root <none> and password file <none>
I20260504 14:08:54.265326  4925 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:54.265386  4925 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:54.265632  4925 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:54.267447  4925 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "71e6309c800841b680b784c0e51aaba0"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "5c2484772eb47643c20ce96918b41703"
server_key_iv: "ef915a3eb003f843dfd93cc08d669ee9"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.267951  4925 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "71e6309c800841b680b784c0e51aaba0"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "5c2484772eb47643c20ce96918b41703"
server_key_iv: "ef915a3eb003f843dfd93cc08d669ee9"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.271517  4925 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.003s	sys 0.002s
I20260504 14:08:54.273909  4939 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.275072  4925 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.001s
I20260504 14:08:54.275163  4925 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "71e6309c800841b680b784c0e51aaba0"
format_stamp: "Formatted at 2026-05-04 14:08:54 on dist-test-slave-2x32"
server_key: "5c2484772eb47643c20ce96918b41703"
server_key_iv: "ef915a3eb003f843dfd93cc08d669ee9"
server_key_version: "encryptionkey@0"
I20260504 14:08:54.275245  4925 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:54.279866  4925 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:54.280411  4925 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:54.280548  4925 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:54.281373  4925 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:54.282423  4925 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:54.282475  4925 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.282516  4925 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:54.282531  4925 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:54.291996  4925 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:37367
I20260504 14:08:54.292021  5052 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:37367 every 8 connection(s)
I20260504 14:08:54.292979  4925 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:54.294198 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 4925
I20260504 14:08:54.294322 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireEncryptionInsecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:08:54.294596 26619 external_mini_cluster.cc:1468] Setting key 760eae5d049e5c69e826c343329e3d29
I20260504 14:08:54.298728  4642 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:54.294997 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:53749 (local address 127.25.254.254:45643)
0504 14:08:54.295170 (+   173us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:54.295174 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:54.296164 (+   990us) server_negotiation.cc:408] Connection header received
0504 14:08:54.297144 (+   980us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:54.297147 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:54.297186 (+    39us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:54.297253 (+    67us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:54.298355 (+  1102us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:54.298358 (+     3us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:54.298360 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: PLAIN
0504 14:08:54.298370 (+    10us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:54.298389 (+    19us) server_negotiation.cc:1092] Received PLAIN auth, user=slave
0504 14:08:54.298396 (+     7us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:54.298469 (+    73us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:54.298596 (+   127us) server_negotiation.cc:300] Negotiation successful
0504 14:08:54.298624 (+    28us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":65}
I20260504 14:08:54.299353  5055 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:54.295377 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:45643 (local address 127.25.254.195:53749)
0504 14:08:54.296036 (+   659us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:54.296079 (+    43us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:54.296933 (+   854us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:54.297407 (+   474us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:54.297416 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:54.297479 (+    63us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:54.298089 (+   610us) client_negotiation.cc:624] Initiating SASL PLAIN handshake
0504 14:08:54.298090 (+     1us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:54.298114 (+    24us) client_negotiation.cc:815] callback for SASL_CB_AUTHNAME
0504 14:08:54.298221 (+   107us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:54.298472 (+   251us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:54.298477 (+     5us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:54.298481 (+     4us) client_negotiation.cc:770] Sending connection context
0504 14:08:54.298619 (+   138us) client_negotiation.cc:241] Negotiation successful
0504 14:08:54.298704 (+    85us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":265,"thread_start_us":96,"threads_started":1}
I20260504 14:08:54.300431  5053 heartbeater.cc:344] Connected to a master server at 127.25.254.254:45643
I20260504 14:08:54.300642  5053 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:54.301011  5053 heartbeater.cc:507] Master 127.25.254.254:45643 requested a full tablet report, sending...
I20260504 14:08:54.302021  4603 ts_manager.cc:194] Registered new tserver with Master: 71e6309c800841b680b784c0e51aaba0 (127.25.254.195:37367)
I20260504 14:08:54.309192 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
W20260504 14:08:54.311964  4642 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:08:54.311257 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50270 (local address 127.25.254.254:45643)
0504 14:08:54.311395 (+   138us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:54.311398 (+     3us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:54.311412 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:08:54.311567 (+   155us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:54.311570 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:54.311612 (+    42us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:54.311685 (+    73us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:54.311859 (+   174us) negotiation.cc:326] Negotiation complete: Network error: Server connection negotiation failed: server connection from 127.0.0.1:50270: BlockingRecv error: recv got EOF from 127.0.0.1:50270 (error 108)
Metrics: {"server-negotiator.queue_time_us":49}
I20260504 14:08:54.312332 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4663
I20260504 14:08:54.317950 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4794
I20260504 14:08:54.324290 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4925
I20260504 14:08:54.330225 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 4572
2026-05-04T14:08:54Z chronyd exiting
[       OK ] SecurityITest.TestRequireEncryptionInsecureCluster (804 ms)
[ RUN      ] SecurityITest.TestRequireAuthenticationSecureCluster
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:54 dist-test-slave-2x32 krb5kdc[5063](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:54 dist-test-slave-2x32 krb5kdc[5063](info): set up 2 sockets
May 04 14:08:54 dist-test-slave-2x32 krb5kdc[5063](info): commencing operation
krb5kdc: starting...
W20260504 14:08:56.391011 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.039s	user 0.000s	sys 0.006s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:08:56 dist-test-slave-2x32 krb5kdc[5063](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903736, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:08:56Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:08:56Z Disabled control of system clock
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:56.550887 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:38033
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34957
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:38033
--encrypt_data_at_rest=true
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:56.660725  5079 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:56.661131  5079 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:56.661196  5079 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:56.665063  5079 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:08:56.665149  5079 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:56.665174  5079 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:56.665194  5079 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:08:56.665212  5079 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:08:56.669845  5079 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34957
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:38033
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:38033
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.5079
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:56.671038  5079 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:56.672092  5079 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:56.678429  5087 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:56.678429  5085 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:56.678428  5084 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:56.678715  5079 server_base.cc:1061] running on GCE node
I20260504 14:08:56.679396  5079 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:56.680373  5079 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:56.681591  5079 hybrid_clock.cc:648] HybridClock initialized: now 1777903736681560 us; error 55 us; skew 500 ppm
May 04 14:08:56 dist-test-slave-2x32 krb5kdc[5063](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903736, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:56.684639  5079 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:08:56.685698  5079 webserver.cc:492] Webserver started at http://127.25.254.254:44035/ using document root <none> and password file <none>
I20260504 14:08:56.686303  5079 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:56.686358  5079 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:56.686555  5079 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:56.688303  5079 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "bced43f8aec3497d98113569c8ba0799"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "b614a912d43649839175375ed695b4d1"
server_key_iv: "14832b3917b42c7536108d51396cd14c"
server_key_version: "encryptionkey@0"
I20260504 14:08:56.688803  5079 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "bced43f8aec3497d98113569c8ba0799"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "b614a912d43649839175375ed695b4d1"
server_key_iv: "14832b3917b42c7536108d51396cd14c"
server_key_version: "encryptionkey@0"
I20260504 14:08:56.692260  5079 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.005s	sys 0.000s
I20260504 14:08:56.694509  5094 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:56.695516  5079 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.002s	sys 0.000s
I20260504 14:08:56.695647  5079 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "bced43f8aec3497d98113569c8ba0799"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "b614a912d43649839175375ed695b4d1"
server_key_iv: "14832b3917b42c7536108d51396cd14c"
server_key_version: "encryptionkey@0"
I20260504 14:08:56.695747  5079 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:56.709538  5079 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:56.716979  5079 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:56.717214  5079 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:56.724995  5079 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:38033
I20260504 14:08:56.725068  5146 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:38033 every 8 connection(s)
I20260504 14:08:56.726092  5079 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:08:56.727506 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 5079
I20260504 14:08:56.727640 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:08:56.727969 26619 external_mini_cluster.cc:1468] Setting key 9c3e8338fe1c63a9bb5f1d74fcbf9efb
I20260504 14:08:56.729501  5147 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:56.736007  5147 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: Bootstrap starting.
May 04 14:08:56 dist-test-slave-2x32 krb5kdc[5063](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903736, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:56.738900  5147 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:56.740037  5147 log.cc:826] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:56.742635  5147 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: No bootstrap required, opened a new log
I20260504 14:08:56.742726  5150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:56.729361 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:38852 (local address 127.25.254.254:38033)
0504 14:08:56.729848 (+   487us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:56.729858 (+    10us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:56.729892 (+    34us) server_negotiation.cc:408] Connection header received
0504 14:08:56.730640 (+   748us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:56.730659 (+    19us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:56.730949 (+   290us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:56.731328 (+   379us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:56.732248 (+   920us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:56.733034 (+   786us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:56.733724 (+   690us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:56.734022 (+   298us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:56.736964 (+  2942us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:56.737032 (+    68us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:56.737066 (+    34us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:56.737100 (+    34us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:56.739258 (+  2158us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:56.739972 (+   714us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:56.739978 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:56.739984 (+     6us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:56.740097 (+   113us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:56.740454 (+   357us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:56.740458 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:56.740459 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:56.740996 (+   537us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:56.741242 (+   246us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:56.741635 (+   393us) server_negotiation.cc:300] Negotiation successful
0504 14:08:56.741960 (+   325us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":319,"thread_start_us":125,"threads_started":1}
I20260504 14:08:56.745468  5147 raft_consensus.cc:359] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } }
I20260504 14:08:56.745667  5147 raft_consensus.cc:385] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:56.745705  5147 raft_consensus.cc:740] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bced43f8aec3497d98113569c8ba0799, State: Initialized, Role: FOLLOWER
I20260504 14:08:56.746222  5147 consensus_queue.cc:260] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } }
I20260504 14:08:56.746346  5147 raft_consensus.cc:399] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:08:56.746424  5147 raft_consensus.cc:493] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:08:56.746507  5147 raft_consensus.cc:3060] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:56.747457  5147 raft_consensus.cc:515] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } }
I20260504 14:08:56.747817  5147 leader_election.cc:304] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: bced43f8aec3497d98113569c8ba0799; no voters: 
I20260504 14:08:56.748134  5147 leader_election.cc:290] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:08:56.748292  5152 raft_consensus.cc:2804] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:56.748683  5152 raft_consensus.cc:697] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [term 1 LEADER]: Becoming Leader. State: Replica: bced43f8aec3497d98113569c8ba0799, State: Running, Role: LEADER
I20260504 14:08:56.749068  5152 consensus_queue.cc:237] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } }
I20260504 14:08:56.749351  5147 sys_catalog.cc:565] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:08:56.750932  5154 sys_catalog.cc:455] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [sys.catalog]: SysCatalogTable state changed. Reason: New leader bced43f8aec3497d98113569c8ba0799. Latest consensus state: current_term: 1 leader_uuid: "bced43f8aec3497d98113569c8ba0799" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } } }
I20260504 14:08:56.750946  5153 sys_catalog.cc:455] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "bced43f8aec3497d98113569c8ba0799" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "bced43f8aec3497d98113569c8ba0799" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 38033 } } }
I20260504 14:08:56.751092  5153 sys_catalog.cc:458] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:56.751091  5154 sys_catalog.cc:458] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799 [sys.catalog]: This master's current role is: LEADER
I20260504 14:08:56.751566  5161 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:08:56.755113  5161 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:08:56.760739  5161 catalog_manager.cc:1357] Generated new cluster ID: 0fd23b8bab804e01a12cb8a11a8c7f03
I20260504 14:08:56.760831  5161 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:08:56.798691  5161 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:08:56.799906  5161 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:08:56.807956  5161 catalog_manager.cc:6044] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: Generated new TSK 0
I20260504 14:08:56.808763  5161 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:56.870798 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38033
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34957
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:56.978751  5175 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:56.978991  5175 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:56.979049  5175 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:56.982515  5175 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:56.982594  5175 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:56.982685  5175 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:08:56.987516  5175 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34957
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38033
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.5175
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:56.988590  5175 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:56.989413  5175 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:56.996433  5181 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:56.996300  5183 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:56.996300  5180 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:56.996595  5175 server_base.cc:1061] running on GCE node
I20260504 14:08:56.997192  5175 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:56.997833  5175 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:56.999027  5175 hybrid_clock.cc:648] HybridClock initialized: now 1777903736998976 us; error 62 us; skew 500 ppm
May 04 14:08:56 dist-test-slave-2x32 krb5kdc[5063](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903736, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:57.002362  5175 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:08:57.003527  5175 webserver.cc:492] Webserver started at http://127.25.254.193:36005/ using document root <none> and password file <none>
I20260504 14:08:57.004192  5175 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:57.004272  5175 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:57.004504  5175 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:57.006420  5175 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "6c9e29bf695849c78cbed6210429e4c9"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "216d82ed39cfebbee3e4f59d28fdb692"
server_key_iv: "763c94b14a9e6f783b97f014316f3396"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.006975  5175 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "6c9e29bf695849c78cbed6210429e4c9"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "216d82ed39cfebbee3e4f59d28fdb692"
server_key_iv: "763c94b14a9e6f783b97f014316f3396"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.010958  5175 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.003s	sys 0.001s
I20260504 14:08:57.013298  5190 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.014371  5175 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.003s	sys 0.001s
I20260504 14:08:57.014504  5175 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "6c9e29bf695849c78cbed6210429e4c9"
format_stamp: "Formatted at 2026-05-04 14:08:56 on dist-test-slave-2x32"
server_key: "216d82ed39cfebbee3e4f59d28fdb692"
server_key_iv: "763c94b14a9e6f783b97f014316f3396"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.014618  5175 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:57.025734  5175 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:57.029130  5175 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:57.029371  5175 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:57.029992  5175 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:57.031034  5175 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:57.031111  5175 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.031189  5175 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:57.031239  5175 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.040982  5175 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:41417
I20260504 14:08:57.041005  5303 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:41417 every 8 connection(s)
I20260504 14:08:57.042016  5175 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:08:57.047115 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 5175
I20260504 14:08:57.047230 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:08:57.047516 26619 external_mini_cluster.cc:1468] Setting key 0b47a8c713e5c194c9cedfb702d79cb8
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5063](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903736, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:57.056052  5150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.044065 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:57787 (local address 127.25.254.254:38033)
0504 14:08:57.044234 (+   169us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.044239 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.044974 (+   735us) server_negotiation.cc:408] Connection header received
0504 14:08:57.045921 (+   947us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.045924 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.045983 (+    59us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.046069 (+    86us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:57.047886 (+  1817us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.048486 (+   600us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.049146 (+   660us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.049323 (+   177us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.052154 (+  2831us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:57.052172 (+    18us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:57.052174 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:57.052199 (+    25us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:57.053741 (+  1542us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.054427 (+   686us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.054430 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.054432 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.054479 (+    47us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.054863 (+   384us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.054865 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.054866 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.055009 (+   143us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:57.055086 (+    77us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.055805 (+   719us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.055918 (+   113us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":61}
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
I20260504 14:08:57.056969  5306 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.044400 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38033 (local address 127.25.254.193:57787)
0504 14:08:57.044836 (+   436us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.044871 (+    35us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.045703 (+   832us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.046233 (+   530us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.046240 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.046610 (+   370us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:57.047695 (+  1085us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.047709 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.048615 (+   906us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.048619 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.049022 (+   403us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.049029 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.049243 (+   214us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.049882 (+   639us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:57.049901 (+    19us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:57.051974 (+  2073us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:57.053882 (+  1908us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.053890 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.053905 (+    15us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.054302 (+   397us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.054588 (+   286us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.054592 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.054594 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.054714 (+   120us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.055127 (+   413us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:57.055136 (+     9us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:57.055430 (+   294us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.055662 (+   232us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.055923 (+   261us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":258,"thread_start_us":120,"threads_started":1}
I20260504 14:08:57.058477  5304 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38033
I20260504 14:08:57.058728  5304 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:57.059240  5304 heartbeater.cc:507] Master 127.25.254.254:38033 requested a full tablet report, sending...
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
I20260504 14:08:57.060974  5111 ts_manager.cc:194] Registered new tserver with Master: 6c9e29bf695849c78cbed6210429e4c9 (127.25.254.193:41417)
I20260504 14:08:57.062597  5111 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:57787
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:57.107299 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38033
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34957
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:57.215432  5311 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:57.215667  5311 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:57.215725  5311 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:57.219135  5311 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:57.219198  5311 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:57.219321  5311 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:08:57.223817  5311 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34957
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38033
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.5311
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:57.224824  5311 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:57.225607  5311 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:57.232317  5317 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:57.232317  5316 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:57.232368  5311 server_base.cc:1061] running on GCE node
W20260504 14:08:57.232317  5319 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:57.232957  5311 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:57.233515  5311 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:57.234712  5311 hybrid_clock.cc:648] HybridClock initialized: now 1777903737234648 us; error 79 us; skew 500 ppm
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5063](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903737, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:57.237601  5311 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:08:57.238665  5311 webserver.cc:492] Webserver started at http://127.25.254.194:39449/ using document root <none> and password file <none>
I20260504 14:08:57.239244  5311 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:57.239293  5311 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:57.239449  5311 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:57.241046  5311 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "a0daa4c734cb496bbd3a40040b8535e6"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "0dc1f0664b3b3917338e7e3aae329e47"
server_key_iv: "9514dd097e1274115be86d34a5e5270c"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.241550  5311 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "a0daa4c734cb496bbd3a40040b8535e6"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "0dc1f0664b3b3917338e7e3aae329e47"
server_key_iv: "9514dd097e1274115be86d34a5e5270c"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.245065  5311 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.006s	sys 0.000s
I20260504 14:08:57.247475  5326 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.248512  5311 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:57.248647  5311 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "a0daa4c734cb496bbd3a40040b8535e6"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "0dc1f0664b3b3917338e7e3aae329e47"
server_key_iv: "9514dd097e1274115be86d34a5e5270c"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.248751  5311 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:57.258970  5311 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:57.261576  5311 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:57.261780  5311 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:57.262409  5311 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:57.263352  5311 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:57.263419  5311 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.263496  5311 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:57.263540  5311 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.273772  5311 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:34621
I20260504 14:08:57.273800  5439 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:34621 every 8 connection(s)
I20260504 14:08:57.275229  5311 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:08:57.283601 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 5311
I20260504 14:08:57.283749 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:08:57.284083 26619 external_mini_cluster.cc:1468] Setting key 27ebda4c6111133d19a454108418b46d
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5063](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903737, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:57.289041  5150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.278966 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:34581 (local address 127.25.254.254:38033)
0504 14:08:57.279104 (+   138us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.279108 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.279130 (+    22us) server_negotiation.cc:408] Connection header received
0504 14:08:57.279476 (+   346us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.279479 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.279528 (+    49us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.279607 (+    79us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:57.280818 (+  1211us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.281395 (+   577us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.282022 (+   627us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.282243 (+   221us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.285021 (+  2778us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:57.285038 (+    17us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:57.285041 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:57.285064 (+    23us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:57.286693 (+  1629us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.287391 (+   698us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.287397 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.287401 (+     4us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.287464 (+    63us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.287781 (+   317us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.287786 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.287789 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.287991 (+   202us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:57.288130 (+   139us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.288634 (+   504us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.288786 (+   152us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":56}
I20260504 14:08:57.289608  5442 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.277877 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38033 (local address 127.25.254.194:34581)
0504 14:08:57.278483 (+   606us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.278521 (+    38us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.279296 (+   775us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.279724 (+   428us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.279732 (+     8us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.280127 (+   395us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:57.280599 (+   472us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.280610 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.281521 (+   911us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.281524 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.281893 (+   369us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.281899 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.282050 (+   151us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.283132 (+  1082us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:57.283150 (+    18us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:57.284876 (+  1726us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:57.286830 (+  1954us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.286839 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.286856 (+    17us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.287250 (+   394us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.287583 (+   333us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.287586 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.287588 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.287686 (+    98us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.288150 (+   464us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:57.288156 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:57.288373 (+   217us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.288558 (+   185us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.288810 (+   252us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":356,"thread_start_us":173,"threads_started":1}
I20260504 14:08:57.290768  5440 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38033
I20260504 14:08:57.291019  5440 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:57.291536  5440 heartbeater.cc:507] Master 127.25.254.254:38033 requested a full tablet report, sending...
I20260504 14:08:57.292691  5111 ts_manager.cc:194] Registered new tserver with Master: a0daa4c734cb496bbd3a40040b8535e6 (127.25.254.194:34621)
I20260504 14:08:57.293334  5111 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:34581
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:08:57.341589 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:38033
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:34957
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:08:57.453802  5447 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:08:57.454046  5447 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:08:57.454113  5447 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:08:57.457643  5447 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:08:57.457716  5447 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:08:57.457854  5447 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:08:57.462360  5447 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:34957
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:38033
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.5447
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:08:57.463493  5447 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:08:57.464407  5447 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:08:57.471186  5455 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:57.471222  5452 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:08:57.471428  5453 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:08:57.471251  5447 server_base.cc:1061] running on GCE node
I20260504 14:08:57.471946  5447 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:08:57.472568  5447 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:08:57.473778  5447 hybrid_clock.cc:648] HybridClock initialized: now 1777903737473728 us; error 67 us; skew 500 ppm
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5063](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903737, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:08:57.476910  5447 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:08:57.478240  5447 webserver.cc:492] Webserver started at http://127.25.254.195:45021/ using document root <none> and password file <none>
I20260504 14:08:57.478828  5447 fs_manager.cc:362] Metadata directory not provided
I20260504 14:08:57.478879  5447 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:08:57.479091  5447 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260504 14:08:57.480818  5447 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "de387eecb1f343d2873251f2a66c6239"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "c48329b7d262cc668b1f1dbecfe2bfd8"
server_key_iv: "f93cd2d4dba439c476dbb25a6eed9046"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.481364  5447 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "de387eecb1f343d2873251f2a66c6239"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "c48329b7d262cc668b1f1dbecfe2bfd8"
server_key_iv: "f93cd2d4dba439c476dbb25a6eed9046"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.485137  5447 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.001s	sys 0.004s
I20260504 14:08:57.487706  5462 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.488866  5447 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.001s	sys 0.000s
I20260504 14:08:57.489015  5447 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "de387eecb1f343d2873251f2a66c6239"
format_stamp: "Formatted at 2026-05-04 14:08:57 on dist-test-slave-2x32"
server_key: "c48329b7d262cc668b1f1dbecfe2bfd8"
server_key_iv: "f93cd2d4dba439c476dbb25a6eed9046"
server_key_version: "encryptionkey@0"
I20260504 14:08:57.489135  5447 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:08:57.522038  5447 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:08:57.525298  5447 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:08:57.525540  5447 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:08:57.526235  5447 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:08:57.527191  5447 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:08:57.527264  5447 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.527339  5447 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:08:57.527403  5447 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:08:57.537672  5447 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:45531
I20260504 14:08:57.538134  5575 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:45531 every 8 connection(s)
I20260504 14:08:57.538923  5447 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:08:57.548349 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 5447
I20260504 14:08:57.548502 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5063](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903737, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:08:57.548801 26619 external_mini_cluster.cc:1468] Setting key eea9039df848e64ca1353794e5c895f2
I20260504 14:08:57.553442  5150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.540981 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:32945 (local address 127.25.254.254:38033)
0504 14:08:57.541155 (+   174us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.541159 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.541835 (+   676us) server_negotiation.cc:408] Connection header received
0504 14:08:57.542776 (+   941us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.542779 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.542834 (+    55us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.542922 (+    88us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:57.544342 (+  1420us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.545117 (+   775us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.545786 (+   669us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.545941 (+   155us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.549494 (+  3553us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:57.549541 (+    47us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:57.549547 (+     6us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:57.549583 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:57.551439 (+  1856us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.551933 (+   494us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.551936 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.551937 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.551984 (+    47us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.552354 (+   370us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.552356 (+     2us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.552357 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.552507 (+   150us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:57.552587 (+    80us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.553133 (+   546us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.553289 (+   156us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":72}
I20260504 14:08:57.554239  5578 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.541264 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:38033 (local address 127.25.254.195:32945)
0504 14:08:57.541684 (+   420us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.541717 (+    33us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.542546 (+   829us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.543231 (+   685us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.543240 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.543649 (+   409us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:08:57.544162 (+   513us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.544174 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.545256 (+  1082us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.545259 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.545643 (+   384us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.545649 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.545883 (+   234us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.547224 (+  1341us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:08:57.547245 (+    21us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:08:57.549272 (+  2027us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:08:57.551558 (+  2286us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.551563 (+     5us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.551573 (+    10us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.551837 (+   264us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.552073 (+   236us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:08:57.552076 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:08:57.552077 (+     1us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:08:57.552268 (+   191us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:08:57.552668 (+   400us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:08:57.552674 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:08:57.552902 (+   228us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.553114 (+   212us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.553463 (+   349us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":234,"thread_start_us":270,"threads_started":1}
I20260504 14:08:57.555421  5576 heartbeater.cc:344] Connected to a master server at 127.25.254.254:38033
I20260504 14:08:57.555696  5576 heartbeater.cc:461] Registering TS with master...
I20260504 14:08:57.556269  5576 heartbeater.cc:507] Master 127.25.254.254:38033 requested a full tablet report, sending...
I20260504 14:08:57.557631  5111 ts_manager.cc:194] Registered new tserver with Master: de387eecb1f343d2873251f2a66c6239 (127.25.254.195:45531)
I20260504 14:08:57.558312  5111 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:32945
I20260504 14:08:57.563215 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:08:57.572649  5150 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.565368 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:38860 (local address 127.25.254.254:38033)
0504 14:08:57.565548 (+   180us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.565553 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.565709 (+   156us) server_negotiation.cc:408] Connection header received
0504 14:08:57.565873 (+   164us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.565876 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.565929 (+    53us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.566014 (+    85us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:08:57.566925 (+   911us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.567425 (+   500us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.568104 (+   679us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.568352 (+   248us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.569415 (+  1063us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:08:57.569442 (+    27us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:08:57.569444 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:08:57.569470 (+    26us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:08:57.571045 (+  1575us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.571468 (+   423us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.571472 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.571474 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.571526 (+    52us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:08:57.571766 (+   240us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:08:57.571769 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:08:57.571772 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:08:57.571958 (+   186us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:08:57.572104 (+   146us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.572343 (+   239us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.572462 (+   119us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":60}
I20260504 14:08:57.575599  5111 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:38860:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:08:57.577976  5111 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:08:57.591955  5589 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.587458 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:34621 (local address 127.0.0.1:50708)
0504 14:08:57.588171 (+   713us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.588184 (+    13us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.588295 (+   111us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.589106 (+   811us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.589109 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.589122 (+    13us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:57.589488 (+   366us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.589494 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.590572 (+  1078us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.590577 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.591423 (+   846us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.591433 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.591560 (+   127us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.591619 (+    59us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.591712 (+    93us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.591791 (+    79us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":632,"thread_start_us":92,"threads_started":1}
I20260504 14:08:57.592412  5587 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.587155 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:45531 (local address 127.0.0.1:34024)
0504 14:08:57.587814 (+   659us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.587869 (+    55us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.588013 (+   144us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.589067 (+  1054us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.589074 (+     7us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.589109 (+    35us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:57.589429 (+   320us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.589438 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.591028 (+  1590us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.591031 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.592037 (+  1006us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.592046 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.592159 (+   113us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.592174 (+    15us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.592225 (+    51us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.592277 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":483,"spinlock_wait_cycles":2432,"thread_start_us":87,"threads_started":1}
I20260504 14:08:57.592746  5590 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.587823 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:41417 (local address 127.0.0.1:40804)
0504 14:08:57.588397 (+   574us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.588412 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.588498 (+    86us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.589208 (+   710us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.589217 (+     9us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.589233 (+    16us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:57.589487 (+   254us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.589494 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.591022 (+  1528us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.591026 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.592053 (+  1027us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.592059 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.592158 (+    99us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.592175 (+    17us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.592225 (+    50us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.592277 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":503,"thread_start_us":100,"threads_started":1}
I20260504 14:08:57.592767  5591 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.587847 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:50708 (local address 127.25.254.194:34621)
0504 14:08:57.588728 (+   881us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.588732 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.588750 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:08:57.588815 (+    65us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.588818 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.588944 (+   126us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.589311 (+   367us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:57.589636 (+   325us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.590453 (+   817us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.591567 (+  1114us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.592129 (+   562us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.592252 (+   123us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.592322 (+    70us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.592627 (+   305us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":781,"thread_start_us":141,"threads_started":1}
I20260504 14:08:57.593354  5592 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.588372 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:40804 (local address 127.25.254.193:41417)
0504 14:08:57.588692 (+   320us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.588697 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.588713 (+    16us) server_negotiation.cc:408] Connection header received
0504 14:08:57.588780 (+    67us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.588784 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.588900 (+   116us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.589035 (+   135us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:57.589629 (+   594us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.590878 (+  1249us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.592349 (+  1471us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.592931 (+   582us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.593037 (+   106us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.593111 (+    74us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.593173 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":240,"thread_start_us":86,"threads_started":1}
I20260504 14:08:57.593832  5588 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.587307 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34024 (local address 127.25.254.195:45531)
0504 14:08:57.588553 (+  1246us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.588559 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.588592 (+    33us) server_negotiation.cc:408] Connection header received
0504 14:08:57.588674 (+    82us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.588679 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.588880 (+   201us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.589035 (+   155us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:57.589582 (+   547us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.590879 (+  1297us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.592552 (+  1673us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.593431 (+   879us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.593516 (+    85us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.593590 (+    74us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.593692 (+   102us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":1122,"thread_start_us":159,"threads_started":1}
I20260504 14:08:57.595912  5374 tablet_service.cc:1511] Processing CreateTablet for tablet 3e24df26d68244f4a34058bbd3184474 (DEFAULT_TABLE table=test-table [id=9d4893a4c83e482a88543207642dc5a9]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:57.595911  5238 tablet_service.cc:1511] Processing CreateTablet for tablet 3e24df26d68244f4a34058bbd3184474 (DEFAULT_TABLE table=test-table [id=9d4893a4c83e482a88543207642dc5a9]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:57.596105  5510 tablet_service.cc:1511] Processing CreateTablet for tablet 3e24df26d68244f4a34058bbd3184474 (DEFAULT_TABLE table=test-table [id=9d4893a4c83e482a88543207642dc5a9]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:08:57.596942  5374 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 3e24df26d68244f4a34058bbd3184474. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:57.596942  5238 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 3e24df26d68244f4a34058bbd3184474. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:57.597069  5510 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 3e24df26d68244f4a34058bbd3184474. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:08:57.602608  5593 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Bootstrap starting.
I20260504 14:08:57.603688  5594 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Bootstrap starting.
I20260504 14:08:57.603919  5595 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Bootstrap starting.
I20260504 14:08:57.605291  5593 tablet_bootstrap.cc:654] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:57.605768  5595 tablet_bootstrap.cc:654] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:57.605768  5594 tablet_bootstrap.cc:654] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Neither blocks nor log segments found. Creating new log.
I20260504 14:08:57.606117  5593 log.cc:826] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:57.606637  5594 log.cc:826] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:57.606637  5595 log.cc:826] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Log is configured to *not* fsync() on all Append() calls
I20260504 14:08:57.608218  5593 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: No bootstrap required, opened a new log
I20260504 14:08:57.608429  5594 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: No bootstrap required, opened a new log
I20260504 14:08:57.608433  5593 ts_tablet_manager.cc:1403] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Time spent bootstrapping tablet: real 0.006s	user 0.005s	sys 0.000s
I20260504 14:08:57.608606  5594 ts_tablet_manager.cc:1403] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Time spent bootstrapping tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:08:57.609901  5595 tablet_bootstrap.cc:492] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: No bootstrap required, opened a new log
I20260504 14:08:57.610208  5595 ts_tablet_manager.cc:1403] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Time spent bootstrapping tablet: real 0.006s	user 0.005s	sys 0.000s
I20260504 14:08:57.611940  5594 raft_consensus.cc:359] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.612195  5594 raft_consensus.cc:385] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:57.612267  5594 raft_consensus.cc:740] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6c9e29bf695849c78cbed6210429e4c9, State: Initialized, Role: FOLLOWER
I20260504 14:08:57.612344  5593 raft_consensus.cc:359] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.612532  5593 raft_consensus.cc:385] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:57.612579  5593 raft_consensus.cc:740] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: de387eecb1f343d2873251f2a66c6239, State: Initialized, Role: FOLLOWER
I20260504 14:08:57.612730  5594 consensus_queue.cc:260] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.612955  5593 consensus_queue.cc:260] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.613401  5595 raft_consensus.cc:359] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.613554  5304 heartbeater.cc:499] Master 127.25.254.254:38033 was elected leader, sending a full tablet report...
I20260504 14:08:57.613619  5595 raft_consensus.cc:385] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:08:57.613685  5595 raft_consensus.cc:740] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a0daa4c734cb496bbd3a40040b8535e6, State: Initialized, Role: FOLLOWER
I20260504 14:08:57.613703  5594 ts_tablet_manager.cc:1434] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Time spent starting tablet: real 0.005s	user 0.006s	sys 0.000s
I20260504 14:08:57.613993  5576 heartbeater.cc:499] Master 127.25.254.254:38033 was elected leader, sending a full tablet report...
I20260504 14:08:57.614171  5595 consensus_queue.cc:260] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.614840  5440 heartbeater.cc:499] Master 127.25.254.254:38033 was elected leader, sending a full tablet report...
I20260504 14:08:57.615223  5595 ts_tablet_manager.cc:1434] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Time spent starting tablet: real 0.005s	user 0.004s	sys 0.000s
I20260504 14:08:57.616925  5593 ts_tablet_manager.cc:1434] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Time spent starting tablet: real 0.008s	user 0.005s	sys 0.000s
W20260504 14:08:57.776615  5441 tablet.cc:2404] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:08:57.789906  5577 tablet.cc:2404] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260504 14:08:57.793423  5305 tablet.cc:2404] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:08:57.861598  5600 raft_consensus.cc:493] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:08:57.861912  5600 raft_consensus.cc:515] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.863129  5600 leader_election.cc:290] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers a0daa4c734cb496bbd3a40040b8535e6 (127.25.254.194:34621), 6c9e29bf695849c78cbed6210429e4c9 (127.25.254.193:41417)
I20260504 14:08:57.866428  5578 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.863530 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:34621 (local address 127.25.254.195:50739)
0504 14:08:57.863656 (+   126us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.863672 (+    16us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.863802 (+   130us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.864290 (+   488us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.864293 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.864316 (+    23us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:57.864538 (+   222us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.864543 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.865373 (+   830us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.865377 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.865985 (+   608us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.865992 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.866133 (+   141us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.866150 (+    17us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.866230 (+    80us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.866290 (+    60us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":44}
I20260504 14:08:57.866889  5591 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.863658 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:50739 (local address 127.25.254.194:34621)
0504 14:08:57.863819 (+   161us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.863823 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.863848 (+    25us) server_negotiation.cc:408] Connection header received
0504 14:08:57.864076 (+   228us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.864083 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.864152 (+    69us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.864272 (+   120us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:57.864704 (+   432us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.865238 (+   534us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.866116 (+   878us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.866631 (+   515us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.866677 (+    46us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.866744 (+    67us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.866796 (+    52us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":78}
I20260504 14:08:57.867537  5394 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "3e24df26d68244f4a34058bbd3184474" candidate_uuid: "de387eecb1f343d2873251f2a66c6239" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a0daa4c734cb496bbd3a40040b8535e6" is_pre_election: true
I20260504 14:08:57.867642  5602 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.863536 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:41417 (local address 127.25.254.195:57979)
0504 14:08:57.864296 (+   760us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.864308 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:08:57.864426 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:08:57.865039 (+   613us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:08:57.865042 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:08:57.865061 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:08:57.865318 (+   257us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.865323 (+     5us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.866406 (+  1083us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.866410 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:08:57.867254 (+   844us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:08:57.867265 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.867385 (+   120us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.867399 (+    14us) client_negotiation.cc:770] Sending connection context
0504 14:08:57.867446 (+    47us) client_negotiation.cc:241] Negotiation successful
0504 14:08:57.867492 (+    46us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":696,"mutex_wait_us":411,"thread_start_us":149,"threads_started":1}
I20260504 14:08:57.867868  5394 raft_consensus.cc:2468] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate de387eecb1f343d2873251f2a66c6239 in term 0.
I20260504 14:08:57.868196  5592 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.863658 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:57979 (local address 127.25.254.193:41417)
0504 14:08:57.863837 (+   179us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.863841 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.864394 (+   553us) server_negotiation.cc:408] Connection header received
0504 14:08:57.864714 (+   320us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.864718 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.864864 (+   146us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.864997 (+   133us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:08:57.865452 (+   455us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.866287 (+   835us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.867399 (+  1112us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.867892 (+   493us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.867933 (+    41us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.868009 (+    76us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.868076 (+    67us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":78}
I20260504 14:08:57.868463  5466 leader_election.cc:304] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a0daa4c734cb496bbd3a40040b8535e6, de387eecb1f343d2873251f2a66c6239; no voters: 
I20260504 14:08:57.868750  5600 raft_consensus.cc:2804] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:08:57.868849  5600 raft_consensus.cc:493] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:08:57.868911  5600 raft_consensus.cc:3060] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:57.868883  5258 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "3e24df26d68244f4a34058bbd3184474" candidate_uuid: "de387eecb1f343d2873251f2a66c6239" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "6c9e29bf695849c78cbed6210429e4c9" is_pre_election: true
I20260504 14:08:57.869164  5258 raft_consensus.cc:2468] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate de387eecb1f343d2873251f2a66c6239 in term 0.
I20260504 14:08:57.870044  5600 raft_consensus.cc:515] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.870450  5600 leader_election.cc:290] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [CANDIDATE]: Term 1 election: Requested vote from peers a0daa4c734cb496bbd3a40040b8535e6 (127.25.254.194:34621), 6c9e29bf695849c78cbed6210429e4c9 (127.25.254.193:41417)
I20260504 14:08:57.870877  5258 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "3e24df26d68244f4a34058bbd3184474" candidate_uuid: "de387eecb1f343d2873251f2a66c6239" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "6c9e29bf695849c78cbed6210429e4c9"
I20260504 14:08:57.870981  5394 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "3e24df26d68244f4a34058bbd3184474" candidate_uuid: "de387eecb1f343d2873251f2a66c6239" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a0daa4c734cb496bbd3a40040b8535e6"
I20260504 14:08:57.871012  5258 raft_consensus.cc:3060] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:57.871107  5394 raft_consensus.cc:3060] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:08:57.872295  5394 raft_consensus.cc:2468] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate de387eecb1f343d2873251f2a66c6239 in term 1.
I20260504 14:08:57.872295  5258 raft_consensus.cc:2468] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate de387eecb1f343d2873251f2a66c6239 in term 1.
I20260504 14:08:57.872614  5466 leader_election.cc:304] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a0daa4c734cb496bbd3a40040b8535e6, de387eecb1f343d2873251f2a66c6239; no voters: 
I20260504 14:08:57.872833  5600 raft_consensus.cc:2804] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:08:57.873095  5600 raft_consensus.cc:697] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 1 LEADER]: Becoming Leader. State: Replica: de387eecb1f343d2873251f2a66c6239, State: Running, Role: LEADER
I20260504 14:08:57.873462  5600 consensus_queue.cc:237] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } }
I20260504 14:08:57.877067  5110 catalog_manager.cc:5671] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 reported cstate change: term changed from 0 to 1, leader changed from <none> to de387eecb1f343d2873251f2a66c6239 (127.25.254.195). New cstate: current_term: 1 leader_uuid: "de387eecb1f343d2873251f2a66c6239" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "de387eecb1f343d2873251f2a66c6239" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 45531 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 } health_report { overall_health: UNKNOWN } } }
I20260504 14:08:57.888638  5588 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.885061 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34040 (local address 127.25.254.195:45531)
0504 14:08:57.885250 (+   189us) server_negotiation.cc:207] Beginning negotiation
0504 14:08:57.885254 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:08:57.885267 (+    13us) server_negotiation.cc:408] Connection header received
0504 14:08:57.885347 (+    80us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:08:57.885350 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:08:57.885404 (+    54us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:08:57.885492 (+    88us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:08:57.886021 (+   529us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.886626 (+   605us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:08:57.887442 (+   816us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:08:57.887604 (+   162us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:08:57.887691 (+    87us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:08:57.888116 (+   425us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:08:57.888212 (+    96us) server_negotiation.cc:1036] Waiting for connection context
0504 14:08:57.888380 (+   168us) server_negotiation.cc:300] Negotiation successful
0504 14:08:57.888463 (+    83us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":97}
I20260504 14:08:57.894677  5258 raft_consensus.cc:1275] T 3e24df26d68244f4a34058bbd3184474 P 6c9e29bf695849c78cbed6210429e4c9 [term 1 FOLLOWER]: Refusing update from remote peer de387eecb1f343d2873251f2a66c6239: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:57.894677  5394 raft_consensus.cc:1275] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 1 FOLLOWER]: Refusing update from remote peer de387eecb1f343d2873251f2a66c6239: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:08:57.895447  5600 consensus_queue.cc:1048] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a0daa4c734cb496bbd3a40040b8535e6" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 34621 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:57.895608  5603 consensus_queue.cc:1048] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6c9e29bf695849c78cbed6210429e4c9" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 41417 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:08:57.902341  5606 mvcc.cc:204] Tried to move back new op lower bound from 7282293710408646656 to 7282293710333845504. Current Snapshot: MvccSnapshot[applied={T|T < 7282293710408646656}]
I20260504 14:08:57.903210  5608 mvcc.cc:204] Tried to move back new op lower bound from 7282293710408646656 to 7282293710333845504. Current Snapshot: MvccSnapshot[applied={T|T < 7282293710408646656}]
I20260504 14:08:57.910534  5111 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:38860:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:57.910799  5111 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:38860:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:08:57.913345  5111 catalog_manager.cc:5958] T 00000000000000000000000000000000 P bced43f8aec3497d98113569c8ba0799: Sending DeleteTablet for 3 replicas of tablet 3e24df26d68244f4a34058bbd3184474
I20260504 14:08:57.914110  5510 tablet_service.cc:1558] Processing DeleteTablet for tablet 3e24df26d68244f4a34058bbd3184474 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:34024
I20260504 14:08:57.914295  5374 tablet_service.cc:1558] Processing DeleteTablet for tablet 3e24df26d68244f4a34058bbd3184474 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:50708
I20260504 14:08:57.914328  5238 tablet_service.cc:1558] Processing DeleteTablet for tablet 3e24df26d68244f4a34058bbd3184474 with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:08:57 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:40804
I20260504 14:08:57.914685  5615 tablet_replica.cc:333] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: stopping tablet replica
I20260504 14:08:57.914973  5615 raft_consensus.cc:2243] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:08:57.915128  5616 tablet_replica.cc:333] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: stopping tablet replica
I20260504 14:08:57.915230 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 5175
I20260504 14:08:57.915377  5616 raft_consensus.cc:2243] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:08:57.915410  5615 raft_consensus.cc:2272] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:57.915681  5616 raft_consensus.cc:2272] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:08:57.917230  5616 ts_tablet_manager.cc:1916] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:57.920071  5616 ts_tablet_manager.cc:1929] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:57.920168  5616 log.cc:1199] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/3e24df26d68244f4a34058bbd3184474
I20260504 14:08:57.920495  5616 ts_tablet_manager.cc:1950] T 3e24df26d68244f4a34058bbd3184474 P a0daa4c734cb496bbd3a40040b8535e6: Deleting consensus metadata
I20260504 14:08:57.921388  5615 ts_tablet_manager.cc:1916] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:08:57.921823  5098 catalog_manager.cc:5002] TS a0daa4c734cb496bbd3a40040b8535e6 (127.25.254.194:34621): tablet 3e24df26d68244f4a34058bbd3184474 (table test-table [id=9d4893a4c83e482a88543207642dc5a9]) successfully deleted
W20260504 14:08:57.923532  5097 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:41417 (error 108)
I20260504 14:08:57.924634  5590 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:08:57.924330 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:41417 (local address 127.0.0.1:40806)
0504 14:08:57.924480 (+   150us) negotiation.cc:107] Waiting for socket to connect
0504 14:08:57.924544 (+    64us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:41417: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":56}
I20260504 14:08:57.924700 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 5311
I20260504 14:08:57.924665  5615 ts_tablet_manager.cc:1929] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:08:57.924986  5615 log.cc:1199] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestRequireAuthenticationSecureCluster.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/3e24df26d68244f4a34058bbd3184474
I20260504 14:08:57.925400  5615 ts_tablet_manager.cc:1950] T 3e24df26d68244f4a34058bbd3184474 P de387eecb1f343d2873251f2a66c6239: Deleting consensus metadata
W20260504 14:08:57.930371  5097 catalog_manager.cc:4729] TS 6c9e29bf695849c78cbed6210429e4c9 (127.25.254.193:41417): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet 3e24df26d68244f4a34058bbd3184474: Network error: Client connection negotiation failed: client connection to 127.25.254.193:41417: connect: Connection refused (error 111)
I20260504 14:08:57.930696  5097 catalog_manager.cc:5002] TS de387eecb1f343d2873251f2a66c6239 (127.25.254.195:45531): tablet 3e24df26d68244f4a34058bbd3184474 (table test-table [id=9d4893a4c83e482a88543207642dc5a9]) successfully deleted
I20260504 14:08:57.932476 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 5447
I20260504 14:08:57.938326 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 5079
2026-05-04T14:08:57Z chronyd exiting
[       OK ] SecurityITest.TestRequireAuthenticationSecureCluster (3610 ms)
[ RUN      ] SecurityITest.TestEncryptionWithKMSIntegration
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:08:57 dist-test-slave-2x32 krb5kdc[5621](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:08:58 dist-test-slave-2x32 krb5kdc[5621](info): set up 2 sockets
May 04 14:08:58 dist-test-slave-2x32 krb5kdc[5621](info): commencing operation
krb5kdc: starting...
W20260504 14:08:59.987160 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.025s	user 0.003s	sys 0.003s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:09:00 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903740, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:09:00Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:09:00Z Disabled control of system clock
WARNING: no policy specified for rangeradmin/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangeradmin/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangeradmin/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangeradmin_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangeradmin/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangeradmin_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for rangerlookup/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangerlookup/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangerlookup/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerlookup_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangerlookup/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerlookup_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for HTTP/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
I20260504 14:09:00.174081 26619 mini_postgres.cc:62] Running initdb...
The files belonging to this database system will be owned by user "slave".
This user must also own the server process.

The database cluster will be initialized with locale "C".
The default database encoding has accordingly been set to "SQL_ASCII".
The default text search configuration will be set to "english".

Data page checksums are disabled.

creating directory /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/postgres ... ok
creating subdirectories ... ok
selecting dynamic shared memory implementation ... posix
selecting default "max_connections" ... 100
selecting default "shared_buffers" ... 128MB
selecting default time zone ... Etc/UTC
creating configuration files ... ok
running bootstrap script ... ok
performing post-bootstrap initialization ... ok
syncing data to disk ... ok

initdb: warning: enabling "trust" authentication for local connections
initdb: hint: You can change this by editing pg_hba.conf or using the option -A, or --auth-local and --auth-host, the next time you run initdb.

Success. You can now start the database server using:

    /tmp/dist-test-taskMMfo7I/build/debug/bin/postgres/pg_ctl -D /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/postgres -l logfile start

2026-05-04 14:09:03.441 UTC [5652] LOG:  starting PostgreSQL 17.2 on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0, 64-bit
2026-05-04 14:09:03.441 UTC [5652] LOG:  listening on IPv4 address "127.25.254.212", port 51003
2026-05-04 14:09:03.448 UTC [5652] LOG:  listening on Unix socket "/tmp/.s.PGSQL.51003"
2026-05-04 14:09:03.456 UTC [5657] LOG:  database system was shut down at 2026-05-04 14:09:01 UTC
2026-05-04 14:09:03.463 UTC [5652] LOG:  database system is ready to accept connections
I20260504 14:09:05.418929 26619 mini_postgres.cc:96] Postgres bound to 51003
2026-05-04 14:09:05.425 UTC [5663] FATAL:  database "slave" does not exist
127.25.254.212:51003 - accepting connections
I20260504 14:09:05.426232 26619 mini_ranger.cc:162] Starting Ranger...
I20260504 14:09:05.449517 26619 mini_ranger.cc:85] Created miniranger Postgres user
I20260504 14:09:05.530095 26619 mini_ranger.cc:88] Created ranger Postgres database
I20260504 14:09:05.530355 26619 mini_ranger.cc:179] Starting Ranger out of /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin
2026-05-04 14:09:06,133  [I] DB FLAVOR :POSTGRES
2026-05-04 14:09:06,134  [I] --------- Verifying Ranger DB connection ---------
2026-05-04 14:09:06,134  [I] Checking connection..
2026-05-04 14:09:06,134  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select 1;"
2026-05-04 14:09:06,536  [I] Checking connection passed.
2026-05-04 14:09:06,536  [I] --------- Verifying version history table ---------
2026-05-04 14:09:06,536  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:09:06,887  [I] Table x_db_version_h does not exist in database ranger
2026-05-04 14:09:06,888  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:09:07,245  [I] Table x_db_version_h does not exist in database ranger
2026-05-04 14:09:07,245  [I] Importing x_db_version_h table schema to database ranger from file: create_dbversion_catalog.sql
2026-05-04 14:09:07,245  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/db/postgres/create_dbversion_catalog.sql 
2026-05-04 14:09:07.594 UTC [5779] WARNING:  there is no transaction in progress
2026-05-04 14:09:07,611  [I] create_dbversion_catalog.sql file imported successfully
2026-05-04 14:09:07,611  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:09:07,914  [I] Table x_db_version_h already exists in database 'ranger'
2026-05-04 14:09:07,914  [I] --------- Importing Ranger Core DB Schema ---------
2026-05-04 14:09:07,914  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'CORE_DB_SCHEMA' and active = 'Y';"
2026-05-04 14:09:08,294  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'CORE_DB_SCHEMA' and active = 'N';"
2026-05-04 14:09:08,654  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "insert into x_db_version_h (version, inst_at, inst_by, updated_at, updated_by,active) values ('CORE_DB_SCHEMA', current_timestamp, 'Ranger 2.6.0', current_timestamp, 'dist-test-slave-2x32.c.gcp-upstream.internal','N') ;"
2026-05-04 14:09:08,958  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_portal_user') as temp;"
2026-05-04 14:09:09,268  [I] Table x_portal_user does not exist in database ranger
2026-05-04 14:09:09,269  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_policy_ref_group') as temp;"
2026-05-04 14:09:09,586  [I] Table x_policy_ref_group does not exist in database ranger
2026-05-04 14:09:09,587  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and active = 'Y';"
2026-05-04 14:09:09,894  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'JAVA_PATCHES' and active = 'Y';"
2026-05-04 14:09:10,252  [I] Importing DB schema to database ranger from file: ranger_core_db_postgres.sql
2026-05-04 14:09:10,252  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/db/postgres/optimized/current/ranger_core_db_postgres.sql 
2026-05-04 14:09:11.236 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.249 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.263 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.276 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.292 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.304 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.356 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.364 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.375 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.386 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.396 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.407 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.415 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.426 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.432 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.441 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.450 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.458 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.464 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.472 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.476 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:11.480 UTC [5977] WARNING:  there is no transaction in progress
2026-05-04 14:09:12,084  [I] ranger_core_db_postgres.sql file imported successfully
2026-05-04 14:09:12,085  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "update x_db_version_h set inst_by='Ranger 2.6.0' where active='Y' and updated_by='localhost';"
2026-05-04 14:09:12,389  [I] Patches status entries updated from base ranger version to current installed ranger version:Ranger 2.6.0
2026-05-04 14:09:12,389  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_portal_user') as temp;"
2026-05-04 14:09:12,747  [I] Table x_portal_user already exists in database 'ranger'
2026-05-04 14:09:12,747  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_policy_ref_group') as temp;"
2026-05-04 14:09:13,107  [I] Table x_policy_ref_group already exists in database 'ranger'
2026-05-04 14:09:13,108  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and active = 'Y';"
2026-05-04 14:09:13,457  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'JAVA_PATCHES' and active = 'Y';"
2026-05-04 14:09:13,788  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "update x_db_version_h set active='Y' where version='CORE_DB_SCHEMA' and active='N' and updated_by='dist-test-slave-2x32.c.gcp-upstream.internal';"
2026-05-04 14:09:14,090  [I] CORE_DB_SCHEMA import status has been updated
2026-05-04 14:09:14,090  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and inst_by = 'Ranger 2.6.0' and active = 'Y';"
2026-05-04 14:09:14,464  [I] DB_PATCHES have already been applied
I20260504 14:09:14.470790 26619 mini_ranger.cc:192] Using Ranger class path: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/ews/lib/*:/usr/lib/jvm/temurin-17-jdk-amd64/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/hadoop-3.4.1/*:/tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/ews/webapp
I20260504 14:09:14.470854 26619 mini_ranger.cc:194] Using host: 127.25.254.212
I20260504 14:09:14.473832 26619 mini_ranger.cc:240] Ranger admin URL: http://127.25.254.212:44445
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer getKeyManagers
WARNING: Config 'ranger.keystore.file' or 'ranger.service.https.attrib.keystore.file' is not found or contains blank value
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer getTrustManagers
WARNING: Config 'ranger.truststore.file' is not found or contains blank value!
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Deriving webapp folder from catalina.base property. folder=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Webapp file =/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp, webAppName = /
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Adding webapp [/] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp] .....
May 04, 2026 2:09:15 PM org.apache.catalina.core.StandardContext setPath
WARNING: A context path must either be an empty string or start with a '/' and do not end with a '/'. The path [/] does not meet these criteria and has been changed to []
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Finished init of webapp [/] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp].
May 04, 2026 2:09:15 PM org.apache.ranger.server.tomcat.EmbeddedServer startServer
INFO: Server Name : miniranger
May 04, 2026 2:09:16 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-nio-44445"]
May 04, 2026 2:09:16 PM org.apache.catalina.core.StandardService startInternal
INFO: Starting service [Tomcat]
May 04, 2026 2:09:16 PM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet engine: [Apache Tomcat/9.0.98]
I20260504 14:09:16.798314 26619 mini_ranger.cc:161] Time spent starting Ranger: real 11.372s	user 0.000s	sys 0.011s
May 04, 2026 2:09:16 PM org.apache.catalina.startup.ContextConfig getDefaultWebXmlFragment
INFO: No global web.xml found
May 04, 2026 2:09:21 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
May 04, 2026 2:09:21 PM org.apache.catalina.core.ApplicationContext log
INFO: Initializing Spring root WebApplicationContext
[EL Warning]: metadata: 2026-05-04 14:09:25.717--ServerSession(208972530)--You have specified multiple ids for the entity class [org.apache.ranger.entity.view.VXXPrincipal] without specifying an @IdClass. By doing this you may lose the ability to find by identity, distributed cache support etc. Note: You may however use EntityManager find operations by passing a list of primary key fields. Else, you will have to use JPQL queries to read your entities. For other id options see @PrimaryKey.
May 04, 2026 2:09:41 PM com.sun.jersey.api.core.PackagesResourceConfig init
INFO: Scanning for root resource and provider classes in the packages:
  org.apache.ranger.rest
  org.apache.ranger.common
  xa.rest
May 04, 2026 2:09:41 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found:
  class org.apache.ranger.rest.UserREST
  class org.apache.ranger.rest.MetricsREST
  class org.apache.ranger.rest.XUserREST
  class org.apache.ranger.rest.XAuditREST
  class org.apache.ranger.rest.TagREST
  class org.apache.ranger.rest.XKeyREST
  class org.apache.ranger.rest.AssetREST
  class org.apache.ranger.rest.PublicAPIsv2
  class org.apache.ranger.rest.PublicAPIs
  class org.apache.ranger.rest.RoleREST
  class org.apache.ranger.rest.SecurityZoneREST
  class org.apache.ranger.rest.ServiceREST
May 04, 2026 2:09:41 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found:
  class org.apache.ranger.common.RangerJAXBContextResolver
  class org.apache.ranger.common.RangerJsonProvider
  class org.apache.ranger.common.RangerJsonMappingExceptionMapper
  class org.apache.ranger.common.RangerJsonParserExceptionMapper
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.servlet.SpringServlet getContext
INFO: Using default applicationContext
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonMappingExceptionMapper, of type org.apache.ranger.common.RangerJsonMappingExceptionMapper as a provider class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonParserExceptionMapper, of type org.apache.ranger.common.RangerJsonParserExceptionMapper as a provider class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonProvider, of type org.apache.ranger.common.RangerJsonProvider as a provider class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, assetREST, of type org.apache.ranger.rest.AssetREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, metricsREST, of type org.apache.ranger.rest.MetricsREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, publicAPIs, of type org.apache.ranger.rest.PublicAPIs as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, publicAPIsv2, of type org.apache.ranger.rest.PublicAPIsv2 as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, roleREST, of type org.apache.ranger.rest.RoleREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, securityZoneREST, of type org.apache.ranger.rest.SecurityZoneREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, serviceREST, of type org.apache.ranger.rest.ServiceREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, tagREST, of type org.apache.ranger.rest.TagREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, userREST, of type org.apache.ranger.rest.UserREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XAuditREST, of type org.apache.ranger.rest.XAuditREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XKeyREST, of type org.apache.ranger.rest.XKeyREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XUserREST, of type org.apache.ranger.rest.XUserREST as a root resource class
May 04, 2026 2:09:42 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:20 PM'
May 04, 2026 2:09:43 PM com.sun.jersey.spi.inject.Errors processErrorMessages
WARNING: The following warnings have been detected with resource and/or provider classes:
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.RoleREST.getRolesInJson(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse), MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInExcel(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse), MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInCsv(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) throws java.io.IOException, MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInJson(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse,java.lang.Boolean), MUST return a non-void type.
May 04, 2026 2:09:43 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-nio-44445"]
I20260504 14:09:44.441789 26619 mini_ranger.cc:274] Created Kudu service
WARNING: no policy specified for rangerkms/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangerkms/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangerkms/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangerkms/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for HTTP/127.25.254.212@KRBTEST.COM; defaulting to no policy
add_principal: Principal or policy already exists while creating "HTTP/127.25.254.212@KRBTEST.COM".
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 3, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 3, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for keyadmin@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "keyadmin@KRBTEST.COM" created.
May 04 14:09:44 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903784, etypes {rep=17 tkt=17 ses=17}, keyadmin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for keyadmin@KRBTEST.COM: 
I20260504 14:09:44.552394 26619 mini_ranger_kms.cc:208] Starting Ranger KMS...
I20260504 14:09:44.565768 26619 mini_ranger_kms.cc:78] Created minirangerkms Postgres user
I20260504 14:09:44.702540 26619 mini_ranger_kms.cc:81] Created rangerkms Postgres database
I20260504 14:09:44.702664 26619 mini_ranger_kms.cc:226] Starting Ranger KMS out of /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms
I20260504 14:09:44.702730 26619 mini_ranger_kms.cc:227] Using postgres at 127.25.254.212:51003
2026-05-04 14:09:45,422  [I] DB FLAVOR :POSTGRES
2026-05-04 14:09:45,423  [I] --------- Verifying Ranger DB connection ---------
2026-05-04 14:09:45,423  [I] Checking connection
2026-05-04 14:09:45,423  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -query "SELECT 1;"
2026-05-04 14:09:45,782  [I] connection success
2026-05-04 14:09:45,782  [I] --------- Verifying Ranger DB tables ---------
2026-05-04 14:09:45,782  [I] Verifying table ranger_masterkey in database rangerkms
2026-05-04 14:09:45,782  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -query "select * from (select table_name from information_schema.tables where table_catalog='rangerkms' and table_name = 'ranger_masterkey') as temp;"
2026-05-04 14:09:46,109  [I] Table ranger_masterkey does not exist in database rangerkms
2026-05-04 14:09:46,109  [I] --------- Importing Ranger Core DB Schema ---------
2026-05-04 14:09:46,109  [I] Importing db schema to database rangerkms from file: kms_core_db_postgres.sql
2026-05-04 14:09:46,109  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:51003/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/db/postgres/kms_core_db_postgres.sql
2026-05-04 14:09:46,460  [I] kms_core_db_postgres.sql DB schema imported successfully
I20260504 14:09:46.943768 26619 mini_ranger_kms.cc:326] Created kms service
I20260504 14:09:47.131633 26619 mini_ranger_kms.cc:342] Created kudu user
I20260504 14:09:47.183369 26619 mini_ranger_kms.cc:359] Created rangerkms user
May 04 14:09:47 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903784, etypes {rep=17 tkt=17 ses=17}, keyadmin@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
I20260504 14:09:47.649868 26619 mini_ranger_kms.cc:400] Added ranger policy
I20260504 14:09:47.650050 26619 mini_ranger_kms.cc:238] Using RangerKMS classpath: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms:/tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/classes/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/lib/*:/usr/lib/jvm/temurin-17-jdk-amd64/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/hadoop-3.4.1/conf
I20260504 14:09:47.650071 26619 mini_ranger_kms.cc:240] Using host: 127.25.254.212
I20260504 14:09:47.654260 26619 mini_ranger_kms.cc:292] Ranger KMS PID: 6277
I20260504 14:09:47.654388 26619 mini_ranger_kms.cc:293] Ranger KMS URL: http://127.25.254.212:34499
14:09:48.501 [main] DEBUG org.apache.hadoop.util.Shell -- setsid exited with exit code 0
14:09:48.517 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/etc/ranger/kms/rangerkms.jceks
14:09:48.645 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
14:09:48.648 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
14:09:48.648 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
14:09:48.649 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
14:09:48.650 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
14:09:48.658 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- UgiMetrics, User and group related metrics
14:09:48.693 [main] DEBUG org.apache.hadoop.security.SecurityUtil -- Setting hadoop.security.token.service.use_ip to true
14:09:48.737 [main] DEBUG org.apache.hadoop.security.Groups --  Creating new Groups object
14:09:48.773 [main] DEBUG org.apache.hadoop.security.Groups -- Group mapping impl=org.apache.hadoop.security.NullGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
14:09:48.856 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:09:48.860 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:09:48.861 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: keyadmin@KRBTEST.COM
14:09:48.863 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "keyadmin@KRBTEST.COM" with name: keyadmin@KRBTEST.COM
14:09:48.863 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "keyadmin@KRBTEST.COM"
14:09:48.864 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- UGI loginUser: keyadmin@KRBTEST.COM (auth:KERBEROS)
14:09:48.867 [TGT Renewer for keyadmin@KRBTEST.COM] DEBUG org.apache.hadoop.security.UserGroupInformation -- Current time is 1777903788867, next refresh is 1777972904000
14:09:48.868 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Acquiring creator semaphore for file:///etc/ranger/kms/rangerkms.jceks
14:09:48.868 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Acquiring creator semaphore for file:///etc/ranger/kms/rangerkms.jceks: duration 0:00.001s
14:09:48.870 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Creating FS file:///etc/ranger/kms/rangerkms.jceks
14:09:48.870 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Loading filesystems
14:09:48.886 [main] DEBUG org.apache.hadoop.fs.FileSystem -- file:// = class org.apache.hadoop.fs.LocalFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:09:48.892 [main] DEBUG org.apache.hadoop.fs.FileSystem -- viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:09:48.895 [main] DEBUG org.apache.hadoop.fs.FileSystem -- har:// = class org.apache.hadoop.fs.HarFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:09:48.898 [main] DEBUG org.apache.hadoop.fs.FileSystem -- http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:09:48.899 [main] DEBUG org.apache.hadoop.fs.FileSystem -- https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:09:48.900 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking for FS supporting file
14:09:48.900 [main] DEBUG org.apache.hadoop.fs.FileSystem -- looking for configuration option fs.file.impl
14:09:48.900 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking in service filesystems for implementation class
14:09:48.900 [main] DEBUG org.apache.hadoop.fs.FileSystem -- FS for file is class org.apache.hadoop.fs.LocalFileSystem
14:09:48.908 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Creating FS file:///etc/ranger/kms/rangerkms.jceks: duration 0:00.038s
May 04, 2026 2:09:48 PM org.apache.ranger.server.tomcat.EmbeddedServer getKeyManagers
WARNING: Config 'ranger.keystore.file' or 'ranger.service.https.attrib.keystore.file' is not found or contains blank value
May 04, 2026 2:09:48 PM org.apache.ranger.server.tomcat.EmbeddedServer getTrustManagers
WARNING: Config 'ranger.truststore.file' is not found or contains blank value!
May 04, 2026 2:09:49 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Webapp file =/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp, webAppName = /kms
May 04, 2026 2:09:49 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Adding webapp [/kms] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp] .....
May 04, 2026 2:09:49 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Finished init of webapp [/kms] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp].
May 04, 2026 2:09:49 PM org.apache.ranger.server.tomcat.EmbeddedServer startServer
INFO: Server Name : minirangerkms
May 04, 2026 2:09:49 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-nio-34499"]
May 04, 2026 2:09:49 PM org.apache.catalina.core.StandardService startInternal
INFO: Starting service [Tomcat]
May 04, 2026 2:09:49 PM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet engine: [Apache Tomcat/9.0.98]
I20260504 14:09:49.966626 26619 mini_ranger_kms.cc:207] Time spent starting Ranger KMS: real 5.414s	user 0.015s	sys 0.240s
I20260504 14:09:49.966760 26619 mini_ranger_kms.cc:413] {"name":"kuduclusterkey","cipher":"AES/CTR/NoPadding","length":128,"description":"kuduclusterkey"}
I20260504 14:09:49.966848 26619 mini_ranger_kms.cc:417] 127.25.254.212:34499/kms/v1/keys
May 04, 2026 2:09:50 PM org.apache.catalina.startup.ContextConfig getDefaultWebXmlFragment
INFO: No global web.xml found
May 04, 2026 2:10:02 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
14:10:02.978 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
14:10:02.982 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
14:10:02.982 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
14:10:02.983 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
14:10:02.983 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
14:10:02.985 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- UgiMetrics, User and group related metrics
14:10:03.050 [main] DEBUG org.apache.hadoop.util.Shell -- setsid exited with exit code 0
14:10:03.051 [main] DEBUG org.apache.hadoop.security.SecurityUtil -- Setting hadoop.security.token.service.use_ip to true
14:10:03.072 [main] DEBUG org.apache.hadoop.security.Groups --  Creating new Groups object
14:10:03.140 [main] DEBUG org.apache.hadoop.security.Groups -- Group mapping impl=org.apache.hadoop.security.NullGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
14:10:03.140 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- -------------------------------------------------------------
14:10:03.140 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp --   Java runtime version : 17.0.18+8
14:10:03.147 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp --   KMS Hadoop Version: 3.3.6
14:10:03.147 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- -------------------------------------------------------------
14:10:03.167 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.RangerKmsAuthorizer()
14:10:03.167 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.init()
14:10:03.170 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFilesForServiceTypeAndPluginclass(kms) Pluging Class :org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:10:03.170 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginImplLibPath for Class (org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:10:03.171 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginImplLibPath for Class (org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer PATH :/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl)
14:10:03.171 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFiles()
14:10:03.171 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFiles()
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/lucene-core-8.11.3.jar
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-audit-2.6.0.jar
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-rest-high-level-client-7.10.2.jar
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/gethostname4j-1.0.0.jar
14:10:03.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-rest-client-7.10.2.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpclient-4.5.13.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/lang-mustache-client-7.10.2.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/gson-2.9.0.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/joda-time-2.10.6.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/rank-eval-client-7.10.2.jar
14:10:03.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpcore-nio-4.4.14.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpmime-4.5.13.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-kms-plugin-2.6.0.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-platform-5.7.0.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/hppc-0.8.0.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-collections-3.2.2.jar
14:10:03.174 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/solr-solrj-8.11.3.jar
14:10:03.175 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/hive-storage-api-2.7.2.jar
14:10:03.182 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/zookeeper-3.9.2.jar
14:10:03.182 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-logging-1.2.jar
14:10:03.182 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-7.10.2.jar
14:10:03.182 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/orc-core-1.5.8.jar
14:10:03.182 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/javax.persistence-2.1.0.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-configuration2-2.8.0.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/orc-shims-1.5.8.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpasyncclient-4.1.4.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/aircompressor-0.27.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-cred-2.6.0.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-x-content-7.10.2.jar
14:10:03.183 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-common-2.6.0.jar
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/eclipselink-2.7.12.jar
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpcore-4.4.14.jar
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-core-7.10.2.jar
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getFilesInDirectory(/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl)
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginFilesForServiceType(): 34 files
14:10:03.184 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginFilesForServiceTypeAndPluginclass(kms) Pluging Class :org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:10:03.186 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:10:03.187 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): calling childClassLoader.findClass()
14:10:03.187 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:10:03.187 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): calling childClassLoader().findClass() 
14:10:03.193 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Runnable)
14:10:03.193 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Runnable): calling childClassLoader.findClass()
14:10:03.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Runnable): interface java.lang.Runnable
14:10:03.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs)
14:10:03.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling childClassLoader.findClass()
14:10:03.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs)
14:10:03.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling childClassLoader().findClass() 
14:10:03.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling componentClassLoader.findClass()
14:10:03.208 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling componentClassLoader.loadClass()
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): interface org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Object)
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Object): calling childClassLoader.findClass()
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Object): class java.lang.Object
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): class org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:10:03.209 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): class org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Map)
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Map): calling childClassLoader.findClass()
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Map): interface java.util.Map
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Throwable)
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Throwable): calling childClassLoader.findClass()
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Throwable): class java.lang.Throwable
14:10:03.210 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Exception)
14:10:03.212 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Exception): calling childClassLoader.findClass()
14:10:03.213 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Exception): class java.lang.Exception
14:10:03.214 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException)
14:10:03.214 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): calling childClassLoader.findClass()
14:10:03.214 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException)
14:10:03.214 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException): calling childClassLoader().findClass() 
14:10:03.215 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException): calling componentClassLoader.findClass()
14:10:03.215 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): calling componentClassLoader.loadClass()
14:10:03.215 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): class org.apache.hadoop.security.authorize.AuthorizationException
14:10:03.215 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest)
14:10:03.216 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): calling childClassLoader.findClass()
14:10:03.216 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest)
14:10:03.224 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): calling childClassLoader().findClass() 
14:10:03.230 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): interface org.apache.ranger.plugin.policyengine.RangerAccessRequest
14:10:03.231 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): interface org.apache.ranger.plugin.policyengine.RangerAccessRequest
14:10:03.231 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.UnknownHostException)
14:10:03.231 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.UnknownHostException): calling childClassLoader.findClass()
14:10:03.231 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.UnknownHostException): class java.net.UnknownHostException
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.IOException)
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.IOException): calling childClassLoader.findClass()
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.IOException): class java.io.IOException
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory)
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): calling childClassLoader.findClass()
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory)
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory): calling childClassLoader().findClass() 
14:10:03.232 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory): calling componentClassLoader.findClass()
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): calling componentClassLoader.loadClass()
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): class org.slf4j.LoggerFactory
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer)
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer): calling childClassLoader.findClass()
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer)
14:10:03.233 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer): calling childClassLoader().findClass() 
14:10:03.234 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer): class org.apache.ranger.plugin.util.RangerPerfTracer
14:10:03.235 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer): class org.apache.ranger.plugin.util.RangerPerfTracer
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.StringBuilder)
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.StringBuilder): calling childClassLoader.findClass()
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.StringBuilder): class java.lang.StringBuilder
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.HashMap)
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.HashMap): calling childClassLoader.findClass()
14:10:03.237 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.HashMap): class java.util.HashMap
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type)
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling childClassLoader.findClass()
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type)
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling childClassLoader().findClass() 
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling componentClassLoader.findClass()
14:10:03.238 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling componentClassLoader.loadClass()
14:10:03.239 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): class org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type
14:10:03.239 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration)
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): calling childClassLoader.findClass()
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration)
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration): calling childClassLoader().findClass() 
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration): calling componentClassLoader.findClass()
14:10:03.240 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): calling componentClassLoader.loadClass()
14:10:03.241 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): class org.apache.hadoop.conf.Configuration
14:10:03.241 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.slf4j.Logger)
14:10:03.241 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.Logger): calling childClassLoader.findClass()
14:10:03.241 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.slf4j.Logger)
14:10:03.241 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.Logger): calling childClassLoader().findClass() 
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.Logger): calling componentClassLoader.findClass()
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.Logger): calling componentClassLoader.loadClass()
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.slf4j.Logger): interface org.slf4j.Logger
14:10:03.242 [main] INFO org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- RangerKmsAuthorizer(conf)...
14:10:03.242 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- Loading ACLs file
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.System)
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.System): calling childClassLoader.findClass()
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.System): class java.lang.System
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration)
14:10:03.242 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling childClassLoader.findClass()
14:10:03.243 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration)
14:10:03.243 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling childClassLoader().findClass() 
14:10:03.243 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling componentClassLoader.findClass()
14:10:03.243 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling componentClassLoader.loadClass()
14:10:03.243 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): class org.apache.hadoop.crypto.key.kms.server.KMSConfiguration
14:10:03.256 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:10:03.256 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:10:03.257 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: keyadmin@KRBTEST.COM
14:10:03.258 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "keyadmin@KRBTEST.COM" with name: keyadmin@KRBTEST.COM
14:10:03.258 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "keyadmin@KRBTEST.COM"
14:10:03.259 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- UGI loginUser: keyadmin@KRBTEST.COM (auth:KERBEROS)
14:10:03.261 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-default.xml) 
14:10:03.262 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-default.xml): calling componentClassLoader.getResources()
14:10:03.262 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-default.xml): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar!/core-default.xml
14:10:03.269 [TGT Renewer for keyadmin@KRBTEST.COM] DEBUG org.apache.hadoop.security.UserGroupInformation -- Current time is 1777903803269, next refresh is 1777972904000
14:10:03.269 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-site.xml) 
14:10:03.270 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-site.xml): calling componentClassLoader.getResources()
14:10:03.270 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-site.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/core-site.xml
14:10:03.273 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.InetAddress)
14:10:03.273 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.InetAddress): calling childClassLoader.findClass()
14:10:03.273 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.InetAddress): class java.net.InetAddress
14:10:03.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin)
14:10:03.287 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin): calling childClassLoader.findClass()
14:10:03.287 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin)
14:10:03.287 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin): calling childClassLoader().findClass() 
14:10:03.288 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin): class org.apache.hadoop.security.SecureClientLogin
14:10:03.288 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin): class org.apache.hadoop.security.SecureClientLogin
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException)
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException): calling childClassLoader.findClass()
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException): class javax.security.auth.login.LoginException
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration)
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration): calling childClassLoader.findClass()
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration): class javax.security.auth.login.Configuration
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration)
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration): calling childClassLoader.findClass()
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration)
14:10:03.289 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration): calling childClassLoader().findClass() 
14:10:03.290 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration): class org.apache.hadoop.security.SecureClientLoginConfiguration
14:10:03.291 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration): class org.apache.hadoop.security.SecureClientLoginConfiguration
14:10:03.291 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.Principal)
14:10:03.291 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.Principal): calling childClassLoader.findClass()
14:10:03.291 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.Principal): interface java.security.Principal
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Collection)
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Collection): calling childClassLoader.findClass()
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Collection): interface java.util.Collection
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Set)
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Set): calling childClassLoader.findClass()
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Set): interface java.util.Set
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.String)
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.String): calling childClassLoader.findClass()
14:10:03.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.String): class java.lang.String
14:10:03.293 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- Ranger KMS Principal : rangerkms/127.25.254.212@KRBTEST.COM, Keytab : /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab, NameRule : RULE:[2:$1@$0](rangeradmin@KRBTEST.COM)s/(.*)@KRBTEST.COM/ranger/
RULE:[2:$1@$0](rangertagsync@KRBTEST.COM)s/(.*)@KRBTEST.COM/rangertagsync/
RULE:[2:$1@$0](rangerusersync@KRBTEST.COM)s/(.*)@KRBTEST.COM/rangerusersync/
RULE:[2:$1@$0](rangerkms@KRBTEST.COM)s/(.*)@KRBTEST.COM/keyadmin/
RULE:[2:$1@$0](atlas@KRBTEST.COM)s/(.*)@KRBTEST.COM/atlas/
DEFAULT
14:10:03.293 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil)
14:10:03.293 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil): calling childClassLoader.findClass()
14:10:03.293 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil)
14:10:03.293 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil): calling childClassLoader().findClass() 
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil): class org.apache.ranger.audit.provider.MiscUtil
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil): class org.apache.ranger.audit.provider.MiscUtil
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ThreadLocal)
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ThreadLocal): calling childClassLoader.findClass()
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ThreadLocal): class java.lang.ThreadLocal
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1)
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1): calling childClassLoader.findClass()
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1)
14:10:03.295 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1): calling childClassLoader().findClass() 
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1): class org.apache.ranger.audit.provider.MiscUtil$1
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1): class org.apache.ranger.audit.provider.MiscUtil$1
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.List)
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.List): calling childClassLoader.findClass()
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.List): interface java.util.List
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.CharSequence)
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.CharSequence): calling childClassLoader.findClass()
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.CharSequence): interface java.lang.CharSequence
14:10:03.296 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NumberFormatException)
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NumberFormatException): calling childClassLoader.findClass()
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NumberFormatException): class java.lang.NumberFormatException
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration)
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): calling childClassLoader.findClass()
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration)
14:10:03.297 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): calling childClassLoader().findClass() 
14:10:03.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): class org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration
14:10:03.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): class org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration
14:10:03.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID)
14:10:03.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID): calling childClassLoader.findClass()
14:10:03.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID): class java.rmi.dgc.VMID
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.DateFormat)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.DateFormat): calling childClassLoader.findClass()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.DateFormat): class java.text.DateFormat
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat): calling childClassLoader.findClass()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat): class java.text.SimpleDateFormat
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Hashtable)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Hashtable): calling childClassLoader.findClass()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Hashtable): class java.util.Hashtable
14:10:03.300 [main] DEBUG org.apache.ranger.audit.provider.MiscUtil -- ==> MiscUtil.initLocalHost()
14:10:03.300 [main] DEBUG org.apache.ranger.audit.provider.MiscUtil -- <== MiscUtil.initLocalHost()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.Subject)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.Subject): calling childClassLoader.findClass()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.Subject): class javax.security.auth.Subject
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): calling childClassLoader.findClass()
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName)
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName): calling childClassLoader().findClass() 
14:10:03.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName): calling componentClassLoader.findClass()
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): calling componentClassLoader.loadClass()
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): class org.apache.hadoop.security.authentication.util.KerberosName
14:10:03.301 [main] INFO org.apache.ranger.audit.provider.MiscUtil -- Creating UGI from keytab directly. keytab=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab, principal=rangerkms/127.25.254.212@KRBTEST.COM
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation)
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): calling childClassLoader.findClass()
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation)
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation): calling childClassLoader().findClass() 
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation): calling componentClassLoader.findClass()
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): calling componentClassLoader.loadClass()
14:10:03.301 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): class org.apache.hadoop.security.UserGroupInformation
May 04 14:10:03 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903803, etypes {rep=17 tkt=17 ses=17}, rangerkms/127.25.254.212@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
14:10:03.341 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:10:03.342 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:10:03.342 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: rangerkms/127.25.254.212@KRBTEST.COM
14:10:03.342 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "rangerkms/127.25.254.212@KRBTEST.COM" with name: rangerkms/127.25.254.212@KRBTEST.COM
14:10:03.343 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "rangerkms/127.25.254.212@KRBTEST.COM"
14:10:03.343 [main] INFO org.apache.ranger.audit.provider.MiscUtil -- Setting UGI=rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList)
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): calling childClassLoader.findClass()
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList)
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList): calling childClassLoader().findClass() 
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList): calling componentClassLoader.findClass()
14:10:03.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): calling componentClassLoader.loadClass()
14:10:03.345 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): class org.apache.hadoop.security.authorize.AccessControlList
14:10:03.348 [main] INFO org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- 'DECRYPT_EEK' Blacklist 'hdfs'
14:10:03.348 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.init()
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin)
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): calling childClassLoader.findClass()
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin)
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): calling childClassLoader().findClass() 
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin)
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin): calling childClassLoader.findClass()
14:10:03.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin)
14:10:03.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin): calling childClassLoader().findClass() 
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin): class org.apache.ranger.plugin.service.RangerBasePlugin
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin): class org.apache.ranger.plugin.service.RangerBasePlugin
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): class org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): class org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.RuntimeException)
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.RuntimeException): calling childClassLoader.findClass()
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.RuntimeException): class java.lang.RuntimeException
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig)
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): calling childClassLoader.findClass()
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig)
14:10:03.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): calling childClassLoader().findClass() 
14:10:03.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration)
14:10:03.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): calling childClassLoader.findClass()
14:10:03.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration)
14:10:03.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): calling childClassLoader().findClass() 
14:10:03.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): class org.apache.ranger.authorization.hadoop.config.RangerConfiguration
14:10:03.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): class org.apache.ranger.authorization.hadoop.config.RangerConfiguration
14:10:03.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): class org.apache.ranger.authorization.hadoop.config.RangerPluginConfig
14:10:03.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): class org.apache.ranger.authorization.hadoop.config.RangerPluginConfig
14:10:03.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory)
14:10:03.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory): calling childClassLoader.findClass()
14:10:03.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory)
14:10:03.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory): calling childClassLoader().findClass() 
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory): class org.apache.ranger.audit.provider.AuditProviderFactory
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory): class org.apache.ranger.audit.provider.AuditProviderFactory
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory)
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): calling childClassLoader.findClass()
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory)
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): calling childClassLoader().findClass() 
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): class org.apache.ranger.audit.provider.StandAloneAuditProviderFactory
14:10:03.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): class org.apache.ranger.audit.provider.StandAloneAuditProviderFactory
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine)
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): calling childClassLoader.findClass()
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine)
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): calling childClassLoader().findClass() 
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): interface org.apache.ranger.plugin.policyengine.RangerPolicyEngine
14:10:03.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): interface org.apache.ranger.plugin.policyengine.RangerPolicyEngine
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient)
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient): calling childClassLoader.findClass()
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient)
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient): calling childClassLoader().findClass() 
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient): interface org.apache.ranger.admin.client.RangerAdminClient
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient): interface org.apache.ranger.admin.client.RangerAdminClient
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.InterruptedException)
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.InterruptedException): calling childClassLoader.findClass()
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.InterruptedException): class java.lang.InterruptedException
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource)
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): calling childClassLoader.findClass()
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource)
14:10:03.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): calling childClassLoader().findClass() 
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): interface org.apache.ranger.plugin.policyengine.RangerAccessResource
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): interface org.apache.ranger.plugin.policyengine.RangerAccessResource
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor)
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): calling childClassLoader.findClass()
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor)
14:10:03.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): calling childClassLoader().findClass() 
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.MalformedURLException)
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.MalformedURLException): calling childClassLoader.findClass()
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.MalformedURLException): class java.net.MalformedURLException
14:10:03.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Collections)
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Collections): calling childClassLoader.findClass()
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Collections): class java.util.Collections
14:10:03.360 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-audit.xml)
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils)
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): calling childClassLoader.findClass()
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils)
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils): calling childClassLoader().findClass() 
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils): calling componentClassLoader.findClass()
14:10:03.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): calling componentClassLoader.loadClass()
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): class org.apache.commons.lang.StringUtils
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Class)
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Class): calling childClassLoader.findClass()
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Class): class java.lang.Class
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassLoader)
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassLoader): calling childClassLoader.findClass()
14:10:03.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassLoader): class java.lang.ClassLoader
14:10:03.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-audit.xml) 
14:10:03.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-audit.xml): calling componentClassLoader.getResources()
14:10:03.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-audit.xml): null
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-audit.xml) 
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-audit.xml): calling componentClassLoader.getResources()
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-audit.xml): null
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.File)
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.File): calling childClassLoader.findClass()
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.File): class java.io.File
14:10:03.363 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-audit.xml does not exists
14:10:03.363 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-audit.xml): couldn't find resource file location
14:10:03.363 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-audit.xml), result=false
14:10:03.363 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ==> addAuditResource(Service Type: kms
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder)
14:10:03.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): calling childClassLoader.findClass()
14:10:03.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder)
14:10:03.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): calling childClassLoader().findClass() 
14:10:03.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): class org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder
14:10:03.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): class org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder
14:10:03.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hive-site.xml) 
14:10:03.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hive-site.xml): calling componentClassLoader.getResources()
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hive-site.xml): null
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hive-site.xml) 
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hive-site.xml): calling componentClassLoader.getResources()
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hive-site.xml): null
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hbase-site.xml) 
14:10:03.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hbase-site.xml): calling componentClassLoader.getResources()
14:10:03.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hbase-site.xml): null
14:10:03.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hbase-site.xml) 
14:10:03.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hbase-site.xml): calling componentClassLoader.getResources()
14:10:03.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hbase-site.xml): null
14:10:03.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hdfs-site.xml) 
14:10:03.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hdfs-site.xml): calling componentClassLoader.getResources()
14:10:03.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hdfs-site.xml): null
14:10:03.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hdfs-site.xml) 
14:10:03.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hdfs-site.xml): calling componentClassLoader.getResources()
14:10:03.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hdfs-site.xml): null
14:10:03.369 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- <== addAuditResource(Service Type: kms)
14:10:03.369 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-security.xml)
14:10:03.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-security.xml) 
14:10:03.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-security.xml): calling componentClassLoader.getResources()
14:10:03.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-security.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-security.xml
14:10:03.369 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-security.xml): resource file is file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-security.xml
14:10:03.370 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-security.xml), result=true
14:10:03.370 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-policymgr-ssl.xml)
14:10:03.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml) 
14:10:03.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:10:03.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:10:03.370 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-policymgr-ssl.xml): resource file is file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:10:03.370 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-policymgr-ssl.xml), result=true
14:10:03.372 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-audit.xml)
14:10:03.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml) 
14:10:03.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml): calling componentClassLoader.getResources()
14:10:03.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml): null
14:10:03.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml) 
14:10:03.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml): calling componentClassLoader.getResources()
14:10:03.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml): null
14:10:03.373 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-audit.xml does not exists
14:10:03.374 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-audit.xml): couldn't find resource file location
14:10:03.374 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-audit.xml), result=false
14:10:03.374 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-security.xml)
14:10:03.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml) 
14:10:03.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml): calling componentClassLoader.getResources()
14:10:03.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml): null
14:10:03.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml) 
14:10:03.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml): calling componentClassLoader.getResources()
14:10:03.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml): null
14:10:03.375 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-security.xml does not exists
14:10:03.375 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-security.xml): couldn't find resource file location
14:10:03.375 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-security.xml), result=false
14:10:03.375 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml)
14:10:03.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml) 
14:10:03.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml): null
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml) 
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml): null
14:10:03.376 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-policymgr-ssl.xml does not exists
14:10:03.376 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml): couldn't find resource file location
14:10:03.376 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml), result=false
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil)
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil): calling childClassLoader.findClass()
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil)
14:10:03.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil): calling childClassLoader().findClass() 
14:10:03.377 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil): class org.apache.ranger.authorization.utils.StringUtil
14:10:03.377 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil): class org.apache.ranger.authorization.utils.StringUtil
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.OutputStream)
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.OutputStream): calling childClassLoader.findClass()
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.OutputStream): class java.io.OutputStream
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream)
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream): calling childClassLoader.findClass()
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream): class java.io.ByteArrayOutputStream
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.InputStream)
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.InputStream): calling childClassLoader.findClass()
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.InputStream): class java.io.InputStream
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream)
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream): calling childClassLoader.findClass()
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream): class java.io.ByteArrayInputStream
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TimeZone)
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TimeZone): calling childClassLoader.findClass()
14:10:03.378 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TimeZone): class java.util.TimeZone
14:10:03.379 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ranger.plugin.kms.use.x-forwarded-for.ipaddress:false
14:10:03.379 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ranger.plugin.kms.trusted.proxy.ipaddresses:[null]
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions)
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): calling childClassLoader.findClass()
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions)
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): calling childClassLoader().findClass() 
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions
14:10:03.379 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions
14:10:03.380 [main] INFO org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- PolicyEngineOptions: { evaluatorType: auto, evaluateDelegateAdminOnly: false, disableContextEnrichers: false, disableCustomConditions: false, disableTagPolicyEvaluation: false, disablePolicyRefresher: false, disableTagRetriever: false, disableUserStoreRetriever: false, enableTagEnricherWithLocalRefresher: false, enableUserStoreEnricherWithLocalRefresher: false, disableTrieLookupPrefilter: false, optimizeTrieForRetrieval: false, cacheAuditResult: false, disableRoleResolution: true, optimizeTrieForSpace: false, optimizeTagTrieForRetrieval: false, optimizeTagTrieForSpace: false, enableResourceMatcherReuse: true }
14:10:03.380 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger)
14:10:03.380 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger): calling childClassLoader.findClass()
14:10:03.380 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger)
14:10:03.380 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger): calling childClassLoader().findClass() 
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger): class org.apache.ranger.plugin.util.DownloadTrigger
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger): class org.apache.ranger.plugin.util.DownloadTrigger
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext)
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): calling childClassLoader.findClass()
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext)
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): calling childClassLoader().findClass() 
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): class org.apache.ranger.plugin.policyengine.RangerPluginContext
14:10:03.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): class org.apache.ranger.plugin.policyengine.RangerPluginContext
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock)
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock): calling childClassLoader.findClass()
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock): class java.util.concurrent.locks.ReentrantReadWriteLock
14:10:03.382 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- ranger.plugin.kms.null_safe.supplier=v2
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject)
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject): calling childClassLoader.findClass()
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject)
14:10:03.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject): calling childClassLoader().findClass() 
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Serializable)
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Serializable): calling childClassLoader.findClass()
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Serializable): interface java.io.Serializable
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject): class org.apache.ranger.plugin.model.RangerBaseModelObject
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject): class org.apache.ranger.plugin.model.RangerBaseModelObject
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier)
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): calling childClassLoader.findClass()
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier)
14:10:03.383 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): calling childClassLoader().findClass() 
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1)
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): calling childClassLoader.findClass()
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1)
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): calling childClassLoader().findClass() 
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2)
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): calling childClassLoader.findClass()
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2)
14:10:03.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): calling childClassLoader().findClass() 
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder)
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder): calling childClassLoader.findClass()
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder)
14:10:03.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder): calling childClassLoader().findClass() 
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder): class org.apache.ranger.plugin.util.PerfDataRecorder
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder): class org.apache.ranger.plugin.util.PerfDataRecorder
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Thread)
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Thread): calling childClassLoader.findClass()
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Thread): class java.lang.Thread
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper)
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): calling childClassLoader.findClass()
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper)
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): calling childClassLoader().findClass() 
14:10:03.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): class org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper
14:10:03.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): class org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper
14:10:03.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils)
14:10:03.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils): calling childClassLoader.findClass()
14:10:03.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils)
14:10:03.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils): calling childClassLoader().findClass() 
14:10:03.388 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils): class org.apache.commons.collections.CollectionUtils
14:10:03.388 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils): class org.apache.commons.collections.CollectionUtils
14:10:03.388 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException)
14:10:03.388 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException): calling childClassLoader.findClass()
14:10:03.388 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException): class java.lang.IllegalArgumentException
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NullPointerException)
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NullPointerException): calling childClassLoader.findClass()
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NullPointerException): class java.lang.NullPointerException
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException)
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException): calling childClassLoader.findClass()
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException): class java.lang.IndexOutOfBoundsException
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Integer)
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Integer): calling childClassLoader.findClass()
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Integer): class java.lang.Integer
14:10:03.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.ArrayList)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.ArrayList): calling childClassLoader.findClass()
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.ArrayList): class java.util.ArrayList
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection): calling childClassLoader.findClass()
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection): calling childClassLoader().findClass() 
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable): calling childClassLoader.findClass()
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable): calling childClassLoader().findClass() 
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable): interface org.apache.commons.collections.Unmodifiable
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable): interface org.apache.commons.collections.Unmodifiable
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): calling childClassLoader.findClass()
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): calling childClassLoader().findClass() 
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): calling childClassLoader.findClass()
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator)
14:10:03.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): calling childClassLoader().findClass() 
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): class org.apache.commons.collections.collection.AbstractCollectionDecorator
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): class org.apache.commons.collections.collection.AbstractCollectionDecorator
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): class org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): class org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection): class org.apache.commons.collections.collection.UnmodifiableCollection
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection): class org.apache.commons.collections.collection.UnmodifiableCollection
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException)
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException): calling childClassLoader.findClass()
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException): class java.lang.UnsupportedOperationException
14:10:03.391 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- superUsers=[], superGroups=[]
14:10:03.391 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- auditExcludedUsers=[], auditExcludedGroups=[], auditExcludedRoles=[]
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator)
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): calling childClassLoader.findClass()
14:10:03.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator)
14:10:03.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): calling childClassLoader().findClass() 
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.script.ScriptException)
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.script.ScriptException): calling childClassLoader.findClass()
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.script.ScriptException)
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.script.ScriptException): calling childClassLoader().findClass() 
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.script.ScriptException): calling componentClassLoader.findClass()
14:10:03.393 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.script.ScriptException): calling componentClassLoader.loadClass()
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.script.ScriptException): class javax.script.ScriptException
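The `javax.script.ScriptException` sequence just above shows the delegation order this trace follows throughout: the plugin classloader first tries its own (child) jars via `findClass()`, and only when that fails does it fall back to the component classloader, which is what happens here for a JDK-module class the plugin jars do not contain. A minimal sketch of that child-first pattern, assuming a plain `URLClassLoader` base and a hypothetical `ChildFirstClassLoader` name (this is not Ranger's actual implementation):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of child-first delegation: plugin jars are searched before the
// component (host application) classloader, mirroring the
// "childClassLoader.findClass() ... componentClassLoader.loadClass()"
// order visible in the log.
public class ChildFirstClassLoader extends URLClassLoader {
    private final ClassLoader componentClassLoader;

    public ChildFirstClassLoader(URL[] pluginJars, ClassLoader componentClassLoader) {
        // Pass null as the parent so the default parent-first delegation
        // of URLClassLoader is bypassed entirely.
        super(pluginJars, null);
        this.componentClassLoader = componentClassLoader;
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // "==> calling childClassLoader.findClass()":
                    // search only the plugin's own jars first.
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    // "calling componentClassLoader.loadClass()":
                    // not in the plugin jars, delegate to the host.
                    c = componentClassLoader.loadClass(name);
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}
```

With an empty plugin-jar list, any JDK class (like `javax.script.ScriptException` above) misses in `findClass` and resolves through the component classloader fallback, which is exactly the two-step sequence the log records.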
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1)
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): calling childClassLoader.findClass()
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1)
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): calling childClassLoader().findClass() 
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Comparator)
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Comparator): calling childClassLoader.findClass()
14:10:03.394 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Comparator): interface java.util.Comparator
14:10:03.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.ParseException)
14:10:03.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.ParseException): calling childClassLoader.findClass()
14:10:03.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.ParseException): class java.text.ParseException
14:10:03.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.regex.Pattern)
14:10:03.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.regex.Pattern): calling childClassLoader.findClass()
14:10:03.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.regex.Pattern): class java.util.regex.Pattern
14:10:03.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor)
14:10:03.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor): calling childClassLoader.findClass()
14:10:03.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor)
14:10:03.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor): calling childClassLoader().findClass() 
14:10:03.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor): class org.apache.ranger.plugin.util.MacroProcessor
14:10:03.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor): class org.apache.ranger.plugin.util.MacroProcessor
14:10:03.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Iterator)
14:10:03.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Iterator): calling childClassLoader.findClass()
14:10:03.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Iterator): interface java.util.Iterator
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2)
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): calling childClassLoader.findClass()
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2)
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): calling childClassLoader().findClass() 
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Arrays)
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Arrays): calling childClassLoader.findClass()
14:10:03.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Arrays): class java.util.Arrays
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler)
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler): calling childClassLoader.findClass()
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler)
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler): calling childClassLoader().findClass() 
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler): interface org.apache.ranger.audit.provider.AuditHandler
14:10:03.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler): interface org.apache.ranger.audit.provider.AuditHandler
14:10:03.404 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AuditProviderFactory: creating..
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider)
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider): calling childClassLoader.findClass()
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider)
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider): calling childClassLoader().findClass() 
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider): class org.apache.ranger.audit.provider.DummyAuditProvider
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider): class org.apache.ranger.audit.provider.DummyAuditProvider
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AuditProviderFactory: initializing..
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Properties)
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Properties): calling childClassLoader.findClass()
14:10:03.405 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Properties): class java.util.Properties
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.url=http://127.25.254.212:44445
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.keystore=/etc/ranger/kms/conf/ranger-plugin-keystore.jks
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.client.connection.timeoutMs=120000
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.client.read.timeoutMs=30000
14:10:03.405 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.truststore=/etc/ranger/kms/conf/ranger-plugin-truststore.jks
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.pollIntervalMs=30000
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.service.name=kms
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.truststore.credential.file=jceks://file/etc/ranger/kmsdev/cred.jceks
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.cache.dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/policycache
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.source.impl=org.apache.ranger.admin.client.RangerAdminRESTClient
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.keystore.credential.file=jceks://file/etc/ranger/kmsdev/cred.jceks
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.ssl.config.file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:10:03.406 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- No v3 audit configuration found. Trying v2 audit configurations
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook)
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): calling childClassLoader.findClass()
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook)
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): calling childClassLoader().findClass() 
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): class org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook
14:10:03.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): class org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore)
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore): calling childClassLoader.findClass()
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore): class java.util.concurrent.Semaphore
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean)
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean): calling childClassLoader.findClass()
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean): class java.util.concurrent.atomic.AtomicBoolean
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup)
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): calling childClassLoader.findClass()
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup)
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): calling childClassLoader().findClass() 
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): class org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): class org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup
14:10:03.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager)
14:10:03.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): calling childClassLoader.findClass()
14:10:03.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager)
14:10:03.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager): calling childClassLoader().findClass() 
14:10:03.408 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
14:10:03.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager): calling componentClassLoader.findClass()
14:10:03.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): calling componentClassLoader.loadClass()
14:10:03.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): class org.apache.hadoop.util.ShutdownHookManager
14:10:03.415 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-default.xml) 
14:10:03.416 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-default.xml): calling componentClassLoader.getResources()
14:10:03.416 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-default.xml): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar!/core-default.xml
14:10:03.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-site.xml) 
14:10:03.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-site.xml): calling componentClassLoader.getResources()
14:10:03.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-site.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/core-site.xml
14:10:03.429 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher)
14:10:03.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher): calling childClassLoader.findClass()
14:10:03.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher)
14:10:03.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher): calling childClassLoader().findClass() 
14:10:03.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher): class org.apache.ranger.plugin.util.PolicyRefresher
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher): class org.apache.ranger.plugin.util.PolicyRefresher
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue)
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue): calling childClassLoader.findClass()
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue): interface java.util.concurrent.BlockingQueue
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Reader)
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Reader): calling childClassLoader.findClass()
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Reader): class java.io.Reader
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileReader)
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileReader): calling childClassLoader.findClass()
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileReader): class java.io.FileReader
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException)
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): calling childClassLoader.findClass()
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException)
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): calling childClassLoader().findClass() 
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): class org.apache.ranger.plugin.util.RangerServiceNotFoundException
14:10:03.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): class org.apache.ranger.plugin.util.RangerServiceNotFoundException
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalStateException)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalStateException): calling childClassLoader.findClass()
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalStateException): class java.lang.IllegalStateException
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TimerTask)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TimerTask): calling childClassLoader.findClass()
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TimerTask): class java.util.TimerTask
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask): calling childClassLoader.findClass()
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask): calling childClassLoader().findClass() 
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask): class org.apache.ranger.plugin.util.DownloaderTask
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask): class org.apache.ranger.plugin.util.DownloaderTask
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.SecurityException)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.SecurityException): calling childClassLoader.findClass()
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.SecurityException): class java.lang.SecurityException
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Writer)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Writer): calling childClassLoader.findClass()
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Writer): class java.io.Writer
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileWriter)
14:10:03.432 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileWriter): calling childClassLoader.findClass()
14:10:03.433 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileWriter): class java.io.FileWriter
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue)
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue): calling childClassLoader.findClass()
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue): class java.util.concurrent.LinkedBlockingQueue
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).PolicyRefresher()
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> RangerBasePlugin.createAdminClient(kms, kms, ranger.plugin.kms)
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- Value for property[ranger.plugin.kms.policy.source.impl] was [org.apache.ranger.admin.client.RangerAdminRESTClient].
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient)
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient): calling childClassLoader.findClass()
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient)
14:10:03.434 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient): calling childClassLoader().findClass() 
14:10:03.435 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient)
14:10:03.435 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): calling childClassLoader.findClass()
14:10:03.435 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient)
14:10:03.435 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): calling childClassLoader().findClass() 
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): class org.apache.ranger.admin.client.AbstractRangerAdminClient
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): class org.apache.ranger.admin.client.AbstractRangerAdminClient
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient): class org.apache.ranger.admin.client.RangerAdminRESTClient
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient): class org.apache.ranger.admin.client.RangerAdminRESTClient
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference)
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): calling childClassLoader.findClass()
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference)
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference): calling childClassLoader().findClass() 
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference): calling componentClassLoader.findClass()
14:10:03.436 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): calling componentClassLoader.loadClass()
14:10:03.437 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): class com.fasterxml.jackson.core.type.TypeReference
14:10:03.437 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1)
14:10:03.437 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): calling childClassLoader.findClass()
14:10:03.437 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1)
14:10:03.437 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): calling childClassLoader().findClass() 
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): class org.apache.ranger.admin.client.RangerAdminRESTClient$1
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): class org.apache.ranger.admin.client.RangerAdminRESTClient$1
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException)
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException): calling childClassLoader.findClass()
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException): class java.io.UnsupportedEncodingException
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException)
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): calling childClassLoader.findClass()
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException)
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException): calling childClassLoader().findClass() 
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException): calling componentClassLoader.findClass()
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): calling componentClassLoader.loadClass()
14:10:03.438 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): class org.apache.hadoop.security.AccessControlException
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie)
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): calling childClassLoader.findClass()
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie)
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie): calling childClassLoader().findClass() 
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie): calling componentClassLoader.findClass()
14:10:03.439 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): calling componentClassLoader.loadClass()
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): class javax.ws.rs.core.Cookie
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie)
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): calling childClassLoader.findClass()
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie)
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie): calling childClassLoader().findClass() 
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie): calling componentClassLoader.findClass()
14:10:03.440 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): calling componentClassLoader.loadClass()
14:10:03.441 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): class javax.ws.rs.core.NewCookie
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils)
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils): calling childClassLoader.findClass()
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils)
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils): calling childClassLoader().findClass() 
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils): class org.apache.ranger.plugin.util.RangerRESTUtils
14:10:03.442 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils): class org.apache.ranger.plugin.util.RangerRESTUtils
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname)
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname): calling childClassLoader.findClass()
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname)
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname): calling childClassLoader().findClass() 
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname): class com.kstruct.gethostname4j.Hostname
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname): class com.kstruct.gethostname4j.Hostname
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Platform)
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Platform): calling childClassLoader.findClass()
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Platform)
14:10:03.443 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Platform): calling childClassLoader().findClass() 
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Platform): class com.sun.jna.Platform
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Platform): class com.sun.jna.Platform
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException)
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException): calling childClassLoader.findClass()
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException): class java.lang.ClassNotFoundException
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.Buffer)
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.Buffer): calling childClassLoader.findClass()
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.Buffer): class java.nio.Buffer
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary)
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): calling childClassLoader.findClass()
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary)
14:10:03.444 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): calling childClassLoader().findClass() 
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library)
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library): calling childClassLoader.findClass()
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library)
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library): calling childClassLoader().findClass() 
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library): interface com.sun.jna.Library
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library): interface com.sun.jna.Library
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): interface com.kstruct.gethostname4j.Hostname$UnixCLibrary
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): interface com.kstruct.gethostname4j.Hostname$UnixCLibrary
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native)
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native): calling childClassLoader.findClass()
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native)
14:10:03.445 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native): calling childClassLoader().findClass() 
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Version)
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Version): calling childClassLoader.findClass()
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Version)
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Version): calling childClassLoader().findClass() 
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Version): interface com.sun.jna.Version
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Version): interface com.sun.jna.Version
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native): class com.sun.jna.Native
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native): class com.sun.jna.Native
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler)
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler): calling childClassLoader.findClass()
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler)
14:10:03.447 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler): calling childClassLoader().findClass() 
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler): interface com.sun.jna.Callback$UncaughtExceptionHandler
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler): interface com.sun.jna.Callback$UncaughtExceptionHandler
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Error)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Error): calling childClassLoader.findClass()
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Error): class java.lang.Error
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$7)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$7): calling childClassLoader.findClass()
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$7)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$7): calling childClassLoader().findClass() 
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$7): class com.sun.jna.Native$7
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$7): class com.sun.jna.Native$7
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler): calling childClassLoader.findClass()
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler): interface java.lang.reflect.InvocationHandler
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError): calling childClassLoader.findClass()
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError): class java.lang.NoSuchMethodError
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError): calling childClassLoader.findClass()
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError): class java.lang.UnsatisfiedLinkError
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException)
14:10:03.448 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException): class java.nio.charset.IllegalCharsetNameException
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException): class java.nio.charset.UnsupportedCharsetException
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException): class java.lang.NoSuchFieldException
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URISyntaxException)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URISyntaxException): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URISyntaxException): class java.net.URISyntaxException
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.PrivilegedAction)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.PrivilegedAction): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.PrivilegedAction): interface java.security.PrivilegedAction
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FilenameFilter)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FilenameFilter): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FilenameFilter): interface java.io.FilenameFilter
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext): calling childClassLoader.findClass()
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext)
14:10:03.449 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext): calling childClassLoader().findClass() 
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext): class com.sun.jna.FromNativeContext
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext): class com.sun.jna.FromNativeContext
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext)
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext): calling childClassLoader.findClass()
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext)
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext): calling childClassLoader().findClass() 
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext)
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext): calling childClassLoader.findClass()
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext)
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext): calling childClassLoader().findClass() 
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext): class com.sun.jna.FunctionResultContext
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext): class com.sun.jna.FunctionResultContext
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext): class com.sun.jna.MethodResultContext
14:10:03.450 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext): class com.sun.jna.MethodResultContext
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.logging.Logger)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.logging.Logger): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.logging.Logger): class java.util.logging.Logger
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.Charset)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.Charset): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.Charset): class java.nio.charset.Charset
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Boolean)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Boolean): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Boolean): class java.lang.Boolean
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.logging.Level)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.logging.Level): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.logging.Level): class java.util.logging.Level
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.WeakHashMap)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.WeakHashMap): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.WeakHashMap): class java.util.WeakHashMap
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$1)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$1): calling childClassLoader.findClass()
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$1)
14:10:03.451 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$1): calling childClassLoader().findClass() 
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$1): class com.sun.jna.Native$1
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$1): class com.sun.jna.Native$1
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$5)
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$5): calling childClassLoader.findClass()
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$5)
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$5): calling childClassLoader().findClass() 
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$5): class com.sun.jna.Native$5
14:10:03.452 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$5): class com.sun.jna.Native$5
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so) 
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar!/com/sun/jna/linux-x86-64/libjnidispatch.so
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URL)
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URL): calling childClassLoader.findClass()
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URL): class java.net.URL
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so) 
14:10:03.453 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar!/com/sun/jna/linux-x86-64/libjnidispatch.so
14:10:03.454 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileOutputStream)
14:10:03.454 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileOutputStream): calling childClassLoader.findClass()
14:10:03.454 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileOutputStream): class java.io.FileOutputStream
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Method)
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Method): calling childClassLoader.findClass()
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Method): class java.lang.reflect.Method
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.ByteBuffer)
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.ByteBuffer): calling childClassLoader.findClass()
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.ByteBuffer): class java.nio.ByteBuffer
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.CharBuffer)
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.CharBuffer): calling childClassLoader.findClass()
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.CharBuffer): class java.nio.CharBuffer
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.ShortBuffer)
14:10:03.457 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.ShortBuffer): calling childClassLoader.findClass()
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.ShortBuffer): class java.nio.ShortBuffer
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.IntBuffer)
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.IntBuffer): calling childClassLoader.findClass()
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.IntBuffer): class java.nio.IntBuffer
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.LongBuffer)
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.LongBuffer): calling childClassLoader.findClass()
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.LongBuffer): class java.nio.LongBuffer
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.FloatBuffer)
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.FloatBuffer): calling childClassLoader.findClass()
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.FloatBuffer): class java.nio.FloatBuffer
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer)
14:10:03.458 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer): class java.nio.DoubleBuffer
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Void)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Void): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Void): class java.lang.Void
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Byte)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Byte): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Byte): class java.lang.Byte
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Character)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Character): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Character): class java.lang.Character
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Short)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Short): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Short): class java.lang.Short
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Long)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Long): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Long): class java.lang.Long
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Float)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Float): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Float): class java.lang.Float
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Double)
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Double): calling childClassLoader.findClass()
14:10:03.459 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Double): class java.lang.Double
14:10:03.460 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Pointer)
14:10:03.460 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Pointer): calling childClassLoader.findClass()
14:10:03.460 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Pointer)
14:10:03.460 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Pointer): calling childClassLoader().findClass() 
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Pointer): class com.sun.jna.Pointer
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Pointer): class com.sun.jna.Pointer
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque)
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque): calling childClassLoader.findClass()
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque)
14:10:03.461 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque): calling childClassLoader().findClass() 
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque): class com.sun.jna.Pointer$Opaque
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque): class com.sun.jna.Pointer$Opaque
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.StringWriter)
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.StringWriter): calling childClassLoader.findClass()
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.StringWriter): class java.io.StringWriter
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure)
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure): calling childClassLoader.findClass()
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure)
14:10:03.462 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure): calling childClassLoader().findClass() 
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure): class com.sun.jna.Structure
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure): class com.sun.jna.Structure
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1): calling childClassLoader.findClass()
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$1)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$1): calling childClassLoader().findClass() 
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$1): class com.sun.jna.Structure$1
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1): class com.sun.jna.Structure$1
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2): calling childClassLoader.findClass()
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$2)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$2): calling childClassLoader().findClass() 
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$2): class com.sun.jna.Structure$2
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2): class com.sun.jna.Structure$2
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3): calling childClassLoader.findClass()
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$3)
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$3): calling childClassLoader().findClass() 
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$3): class com.sun.jna.Structure$3
14:10:03.464 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3): class com.sun.jna.Structure$3
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Memory)
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Memory): calling childClassLoader.findClass()
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Memory)
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Memory): calling childClassLoader().findClass() 
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Memory): class com.sun.jna.Memory
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Memory): class com.sun.jna.Memory
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.InstantiationException)
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.InstantiationException): calling childClassLoader.findClass()
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.InstantiationException): class java.lang.InstantiationException
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException)
14:10:03.465 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException): calling childClassLoader.findClass()
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException): class java.lang.IllegalAccessException
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException): calling childClassLoader.findClass()
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException): class java.lang.reflect.InvocationTargetException
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException): calling childClassLoader.findClass()
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException): class java.lang.NoSuchMethodException
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated): calling childClassLoader.findClass()
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated): calling childClassLoader().findClass() 
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated): class com.sun.jna.Structure$AutoAllocated
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated): class com.sun.jna.Structure$AutoAllocated
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext): calling childClassLoader.findClass()
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext)
14:10:03.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext): calling childClassLoader().findClass() 
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext): class com.sun.jna.ToNativeContext
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext): class com.sun.jna.ToNativeContext
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext): calling childClassLoader.findClass()
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext): calling childClassLoader().findClass() 
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext): class com.sun.jna.StructureWriteContext
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext): class com.sun.jna.StructureWriteContext
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter): calling childClassLoader.findClass()
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter): calling childClassLoader().findClass() 
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter): interface com.sun.jna.ToNativeConverter
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter): interface com.sun.jna.ToNativeConverter
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter): calling childClassLoader.findClass()
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter)
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter): calling childClassLoader().findClass() 
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter): interface com.sun.jna.FromNativeConverter
14:10:03.467 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter): interface com.sun.jna.FromNativeConverter
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext)
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext): calling childClassLoader.findClass()
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext)
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext): calling childClassLoader().findClass() 
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext): class com.sun.jna.StructureReadContext
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext): class com.sun.jna.StructureReadContext
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue)
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue): calling childClassLoader.findClass()
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue)
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue): calling childClassLoader().findClass() 
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue): interface com.sun.jna.Structure$ByValue
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue): interface com.sun.jna.Structure$ByValue
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Callback)
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Callback): calling childClassLoader.findClass()
14:10:03.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Callback)
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Callback): calling childClassLoader().findClass() 
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Callback): interface com.sun.jna.Callback
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Callback): interface com.sun.jna.Callback
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions)
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions): calling childClassLoader.findClass()
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions)
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions): calling childClassLoader().findClass() 
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions): class com.sun.jna.CallbackReference$AttachOptions
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions): class com.sun.jna.CallbackReference$AttachOptions
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference)
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference): calling childClassLoader.findClass()
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference)
14:10:03.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference): calling childClassLoader().findClass() 
14:10:03.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference)
14:10:03.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference): calling childClassLoader.findClass()
14:10:03.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference): class java.lang.ref.WeakReference
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference): class com.sun.jna.CallbackReference
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference): class com.sun.jna.CallbackReference
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy)
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy): calling childClassLoader.findClass()
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy)
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy): calling childClassLoader().findClass() 
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy): interface com.sun.jna.CallbackProxy
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy): interface com.sun.jna.CallbackProxy
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ref.Reference)
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ref.Reference): calling childClassLoader.findClass()
14:10:03.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ref.Reference): class java.lang.ref.Reference
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.WString)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.WString): calling childClassLoader.findClass()
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.WString)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.WString): calling childClassLoader().findClass() 
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Comparable)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Comparable): calling childClassLoader.findClass()
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Comparable): interface java.lang.Comparable
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.WString): class com.sun.jna.WString
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.WString): class com.sun.jna.WString
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped): calling childClassLoader.findClass()
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped): calling childClassLoader().findClass() 
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped): interface com.sun.jna.NativeMapped
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped): interface com.sun.jna.NativeMapped
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType): calling childClassLoader.findClass()
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.IntegerType)
14:10:03.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.IntegerType): calling childClassLoader().findClass() 
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Number)
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Number): calling childClassLoader.findClass()
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Number): class java.lang.Number
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.IntegerType): class com.sun.jna.IntegerType
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType): class com.sun.jna.IntegerType
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.PointerType)
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.PointerType): calling childClassLoader.findClass()
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.PointerType)
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.PointerType): calling childClassLoader().findClass() 
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.PointerType): class com.sun.jna.PointerType
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.PointerType): class com.sun.jna.PointerType
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv)
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv): calling childClassLoader.findClass()
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv)
14:10:03.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv): calling childClassLoader().findClass() 
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv): class com.sun.jna.JNIEnv
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv): class com.sun.jna.JNIEnv
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback): calling childClassLoader.findClass()
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback): calling childClassLoader().findClass() 
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback): interface com.sun.jna.Native$ffi_callback
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback): interface com.sun.jna.Native$ffi_callback
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes): calling childClassLoader.findClass()
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes): calling childClassLoader().findClass() 
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes): class com.sun.jna.Structure$FFIType$FFITypes
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes): class com.sun.jna.Structure$FFIType$FFITypes
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$2)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$2): calling childClassLoader.findClass()
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$2)
14:10:03.474 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$2): calling childClassLoader().findClass() 
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$2): class com.sun.jna.Native$2
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$2): class com.sun.jna.Native$2
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler)
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler): calling childClassLoader.findClass()
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler)
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler): calling childClassLoader().findClass() 
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler): class com.sun.jna.Library$Handler
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler): class com.sun.jna.Library$Handler
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention)
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention): calling childClassLoader.findClass()
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention)
14:10:03.475 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention): calling childClassLoader().findClass() 
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention): interface com.sun.jna.AltCallingConvention
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention): interface com.sun.jna.AltCallingConvention
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary)
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary): calling childClassLoader.findClass()
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary)
14:10:03.476 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary): calling childClassLoader().findClass() 
14:10:03.477 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary): class com.sun.jna.NativeLibrary
14:10:03.477 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary): class com.sun.jna.NativeLibrary
14:10:03.477 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.InputStreamReader)
14:10:03.477 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.InputStreamReader): calling childClassLoader.findClass()
14:10:03.477 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.InputStreamReader): class java.io.InputStreamReader
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.LinkedHashSet)
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.LinkedHashSet): calling childClassLoader.findClass()
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.LinkedHashSet): class java.util.LinkedHashSet
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Runtime)
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Runtime): calling childClassLoader.findClass()
14:10:03.478 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Runtime): class java.lang.Runtime
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.BufferedReader)
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.BufferedReader): calling childClassLoader.findClass()
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.BufferedReader): class java.io.BufferedReader
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Process)
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Process): calling childClassLoader.findClass()
14:10:03.481 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Process): class java.lang.Process
14:10:03.483 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.StringTokenizer)
14:10:03.483 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.StringTokenizer): calling childClassLoader.findClass()
14:10:03.483 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.StringTokenizer): class java.util.StringTokenizer
14:10:03.484 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy)
14:10:03.484 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy): calling childClassLoader.findClass()
14:10:03.484 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy): class java.lang.reflect.Proxy
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError)
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError): calling childClassLoader.findClass()
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError): class java.lang.NoClassDefFoundError
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException)
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException): calling childClassLoader.findClass()
14:10:03.485 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException): class java.lang.reflect.UndeclaredThrowableException
14:10:03.486 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils)
14:10:03.486 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils): calling childClassLoader.findClass()
14:10:03.486 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils)
14:10:03.486 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils): calling childClassLoader().findClass() 
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils): class com.sun.jna.internal.ReflectionUtils
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils): class com.sun.jna.internal.ReflectionUtils
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.AssertionError)
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.AssertionError): calling childClassLoader.findClass()
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.AssertionError): class java.lang.AssertionError
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles)
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles): calling childClassLoader.findClass()
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles): class java.lang.invoke.MethodHandles
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle)
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle): calling childClassLoader.findClass()
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle): class java.lang.invoke.MethodHandle
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup)
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup): calling childClassLoader.findClass()
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup): class java.lang.invoke.MethodHandles$Lookup
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType)
14:10:03.487 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType): calling childClassLoader.findClass()
14:10:03.488 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType): class java.lang.invoke.MethodType
14:10:03.488 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Function)
14:10:03.488 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Function): calling childClassLoader.findClass()
14:10:03.488 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Function)
14:10:03.488 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Function): calling childClassLoader().findClass() 
14:10:03.489 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Function): class com.sun.jna.Function
14:10:03.489 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Function): class com.sun.jna.Function
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassCastException)
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassCastException): calling childClassLoader.findClass()
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassCastException): class java.lang.ClassCastException
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext)
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext): calling childClassLoader.findClass()
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext)
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext): calling childClassLoader().findClass() 
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext)
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext): calling childClassLoader.findClass()
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext)
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext): calling childClassLoader().findClass() 
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext): class com.sun.jna.FunctionParameterContext
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext): class com.sun.jna.FunctionParameterContext
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext): class com.sun.jna.MethodParameterContext
14:10:03.490 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext): class com.sun.jna.MethodParameterContext
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker): calling childClassLoader.findClass()
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker): calling childClassLoader().findClass() 
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker): class com.sun.jna.VarArgsChecker
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker): class com.sun.jna.VarArgsChecker
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): calling childClassLoader.findClass()
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): calling childClassLoader().findClass() 
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): class com.sun.jna.VarArgsChecker$RealVarArgsChecker
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): class com.sun.jna.VarArgsChecker$RealVarArgsChecker
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): calling childClassLoader.findClass()
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker)
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): calling childClassLoader().findClass() 
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): class com.sun.jna.VarArgsChecker$NoVarArgsChecker
14:10:03.491 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): class com.sun.jna.VarArgsChecker$NoVarArgsChecker
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo)
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo): calling childClassLoader.findClass()
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo)
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo): calling childClassLoader().findClass() 
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo): class com.sun.jna.Library$Handler$FunctionInfo
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo): class com.sun.jna.Library$Handler$FunctionInfo
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead)
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead): calling childClassLoader.findClass()
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead)
14:10:03.492 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead): calling childClassLoader().findClass() 
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead): interface com.sun.jna.Function$PostCallRead
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead): interface com.sun.jna.Function$PostCallRead
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability)
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability): calling childClassLoader.findClass()
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability)
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability): calling childClassLoader().findClass() 
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability): class org.apache.ranger.plugin.util.RangerPluginCapability
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability): class org.apache.ranger.plugin.util.RangerPluginCapability
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature)
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): calling childClassLoader.findClass()
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature)
14:10:03.493 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): calling childClassLoader().findClass() 
14:10:03.494 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Enum)
14:10:03.494 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Enum): calling childClassLoader.findClass()
14:10:03.494 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Enum): class java.lang.Enum
14:10:03.494 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): class org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature
14:10:03.494 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): class org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature
14:10:03.495 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder)
14:10:03.495 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder): calling childClassLoader.findClass()
14:10:03.495 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder)
14:10:03.495 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder): calling childClassLoader().findClass() 
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder): class com.google.gson.GsonBuilder
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder): class com.google.gson.GsonBuilder
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy)
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy): calling childClassLoader.findClass()
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy)
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy): calling childClassLoader().findClass() 
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy): interface com.google.gson.FieldNamingStrategy
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy): interface com.google.gson.FieldNamingStrategy
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder)
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder): calling childClassLoader.findClass()
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder)
14:10:03.496 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder): calling childClassLoader().findClass() 
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory): calling childClassLoader.findClass()
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory): calling childClassLoader().findClass() 
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory): interface com.google.gson.TypeAdapterFactory
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory): interface com.google.gson.TypeAdapterFactory
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Cloneable)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Cloneable): calling childClassLoader.findClass()
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Cloneable): interface java.lang.Cloneable
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder): class com.google.gson.internal.Excluder
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder): class com.google.gson.internal.Excluder
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException): calling childClassLoader.findClass()
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException): class java.lang.CloneNotSupportedException
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter): calling childClassLoader.findClass()
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter)
14:10:03.497 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter): calling childClassLoader().findClass() 
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter): class com.google.gson.TypeAdapter
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter): class com.google.gson.TypeAdapter
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1)
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1): calling childClassLoader.findClass()
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1)
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1): calling childClassLoader().findClass() 
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1): class com.google.gson.internal.Excluder$1
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1): class com.google.gson.internal.Excluder$1
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy)
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy): calling childClassLoader.findClass()
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy)
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy): calling childClassLoader().findClass() 
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy): class com.google.gson.LongSerializationPolicy
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy): class com.google.gson.LongSerializationPolicy
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1)
14:10:03.498 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1): calling childClassLoader.findClass()
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1)
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1): calling childClassLoader().findClass() 
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1): class com.google.gson.LongSerializationPolicy$1
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1): class com.google.gson.LongSerializationPolicy$1
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2)
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2): calling childClassLoader.findClass()
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2)
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2): calling childClassLoader().findClass() 
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2): class com.google.gson.LongSerializationPolicy$2
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2): class com.google.gson.LongSerializationPolicy$2
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonElement)
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonElement): calling childClassLoader.findClass()
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonElement)
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonElement): calling childClassLoader().findClass() 
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonElement): class com.google.gson.JsonElement
14:10:03.499 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonElement): class com.google.gson.JsonElement
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonNull)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonNull): calling childClassLoader.findClass()
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonNull)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonNull): calling childClassLoader().findClass() 
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonNull): class com.google.gson.JsonNull
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonNull): class com.google.gson.JsonNull
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive): calling childClassLoader.findClass()
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive): calling childClassLoader().findClass() 
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive): class com.google.gson.JsonPrimitive
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive): class com.google.gson.JsonPrimitive
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy): calling childClassLoader.findClass()
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy)
14:10:03.500 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy): calling childClassLoader().findClass() 
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy): class com.google.gson.FieldNamingPolicy
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy): class com.google.gson.FieldNamingPolicy
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1)
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1): calling childClassLoader.findClass()
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1)
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1): calling childClassLoader().findClass() 
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1): class com.google.gson.FieldNamingPolicy$1
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1): class com.google.gson.FieldNamingPolicy$1
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2)
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2): calling childClassLoader.findClass()
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2)
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2): calling childClassLoader().findClass() 
14:10:03.501 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2): class com.google.gson.FieldNamingPolicy$2
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2): class com.google.gson.FieldNamingPolicy$2
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3): calling childClassLoader.findClass()
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3): calling childClassLoader().findClass() 
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3): class com.google.gson.FieldNamingPolicy$3
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3): class com.google.gson.FieldNamingPolicy$3
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4): calling childClassLoader.findClass()
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4): calling childClassLoader().findClass() 
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4): class com.google.gson.FieldNamingPolicy$4
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4): class com.google.gson.FieldNamingPolicy$4
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5): calling childClassLoader.findClass()
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5): calling childClassLoader().findClass() 
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5): class com.google.gson.FieldNamingPolicy$5
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5): class com.google.gson.FieldNamingPolicy$5
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6): calling childClassLoader.findClass()
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6)
14:10:03.502 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6): calling childClassLoader().findClass() 
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6): class com.google.gson.FieldNamingPolicy$6
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6): class com.google.gson.FieldNamingPolicy$6
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7)
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7): calling childClassLoader.findClass()
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7)
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7): calling childClassLoader().findClass() 
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7): class com.google.gson.FieldNamingPolicy$7
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7): class com.google.gson.FieldNamingPolicy$7
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson)
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson): calling childClassLoader.findClass()
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson)
14:10:03.503 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson): calling childClassLoader().findClass() 
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson): class com.google.gson.Gson
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson): class com.google.gson.Gson
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy)
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy): calling childClassLoader.findClass()
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy)
14:10:03.504 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy): calling childClassLoader().findClass() 
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy): interface com.google.gson.ToNumberStrategy
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy): interface com.google.gson.ToNumberStrategy
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter): calling childClassLoader.findClass()
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter): calling childClassLoader().findClass() 
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter): class com.google.gson.Gson$FutureTypeAdapter
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter): class com.google.gson.Gson$FutureTypeAdapter
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Type)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Type): calling childClassLoader.findClass()
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Type): interface java.lang.reflect.Type
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.EOFException)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.EOFException): calling childClassLoader.findClass()
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.EOFException): class java.io.EOFException
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException): calling childClassLoader.findClass()
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException): calling childClassLoader().findClass() 
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException)
14:10:03.505 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException): calling childClassLoader.findClass()
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonParseException)
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonParseException): calling childClassLoader().findClass() 
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonParseException): class com.google.gson.JsonParseException
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException): class com.google.gson.JsonParseException
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException): class com.google.gson.JsonSyntaxException
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException): class com.google.gson.JsonSyntaxException
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader)
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader): calling childClassLoader.findClass()
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader)
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader): calling childClassLoader().findClass() 
14:10:03.506 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Closeable)
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Closeable): calling childClassLoader.findClass()
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Closeable): interface java.io.Closeable
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader): class com.google.gson.stream.JsonReader
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader): class com.google.gson.stream.JsonReader
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader)
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader): calling childClassLoader.findClass()
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader)
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader): calling childClassLoader().findClass() 
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader): class com.google.gson.internal.bind.JsonTreeReader
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader): class com.google.gson.internal.bind.JsonTreeReader
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.StringReader)
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.StringReader): calling childClassLoader.findClass()
14:10:03.507 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.StringReader): class java.io.StringReader
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException): calling childClassLoader.findClass()
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonIOException)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonIOException): calling childClassLoader().findClass() 
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonIOException): class com.google.gson.JsonIOException
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException): class com.google.gson.JsonIOException
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Appendable)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Appendable): calling childClassLoader.findClass()
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Appendable): interface java.lang.Appendable
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter): calling childClassLoader.findClass()
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter): calling childClassLoader().findClass() 
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Flushable)
14:10:03.508 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Flushable): calling childClassLoader.findClass()
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Flushable): interface java.io.Flushable
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter): class com.google.gson.stream.JsonWriter
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter): class com.google.gson.stream.JsonWriter
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter)
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter): calling childClassLoader.findClass()
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter)
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter): calling childClassLoader().findClass() 
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter): class com.google.gson.internal.bind.JsonTreeWriter
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter): class com.google.gson.internal.bind.JsonTreeWriter
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$3)
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$3): calling childClassLoader.findClass()
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$3)
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$3): calling childClassLoader().findClass() 
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$3): class com.google.gson.Gson$3
14:10:03.509 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$3): class com.google.gson.Gson$3
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$1)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$1): calling childClassLoader.findClass()
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$1)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$1): calling childClassLoader().findClass() 
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$1): class com.google.gson.Gson$1
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$1): class com.google.gson.Gson$1
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$2)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$2): calling childClassLoader.findClass()
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$2)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$2): calling childClassLoader().findClass() 
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$2): class com.google.gson.Gson$2
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$2): class com.google.gson.Gson$2
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException): calling childClassLoader.findClass()
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException)
14:10:03.510 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException): calling childClassLoader().findClass() 
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException): class com.google.gson.stream.MalformedJsonException
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException): class com.google.gson.stream.MalformedJsonException
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy)
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy): calling childClassLoader.findClass()
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy)
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy): calling childClassLoader().findClass() 
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy): class com.google.gson.ToNumberPolicy
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy): class com.google.gson.ToNumberPolicy
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1)
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1): calling childClassLoader.findClass()
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1)
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1): calling childClassLoader().findClass() 
14:10:03.511 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1): class com.google.gson.ToNumberPolicy$1
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1): class com.google.gson.ToNumberPolicy$1
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2): calling childClassLoader.findClass()
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2): calling childClassLoader().findClass() 
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2): class com.google.gson.ToNumberPolicy$2
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2): class com.google.gson.ToNumberPolicy$2
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3): calling childClassLoader.findClass()
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3): calling childClassLoader().findClass() 
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3): class com.google.gson.ToNumberPolicy$3
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3): class com.google.gson.ToNumberPolicy$3
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4): calling childClassLoader.findClass()
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4)
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4): calling childClassLoader().findClass() 
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4): class com.google.gson.ToNumberPolicy$4
14:10:03.512 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4): class com.google.gson.ToNumberPolicy$4
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber)
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber): calling childClassLoader.findClass()
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber)
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber): calling childClassLoader().findClass() 
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber): class com.google.gson.internal.LazilyParsedNumber
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber): class com.google.gson.internal.LazilyParsedNumber
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.math.BigDecimal)
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.math.BigDecimal): calling childClassLoader.findClass()
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.math.BigDecimal): class java.math.BigDecimal
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken)
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken): calling childClassLoader.findClass()
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken)
14:10:03.513 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken): calling childClassLoader().findClass() 
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken): class com.google.gson.reflect.TypeToken
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken): class com.google.gson.reflect.TypeToken
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions)
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions): calling childClassLoader.findClass()
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions)
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions): calling childClassLoader().findClass() 
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions): class com.google.gson.internal.$Gson$Preconditions
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions): class com.google.gson.internal.$Gson$Preconditions
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types)
14:10:03.514 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types): calling childClassLoader.findClass()
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types)
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types): calling childClassLoader().findClass() 
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types): class com.google.gson.internal.$Gson$Types
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types): class com.google.gson.internal.$Gson$Types
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.NoSuchElementException)
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.NoSuchElementException): calling childClassLoader.findClass()
14:10:03.515 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.NoSuchElementException): class java.util.NoSuchElementException
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType)
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType): calling childClassLoader.findClass()
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType): interface java.lang.reflect.ParameterizedType
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType)
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType): calling childClassLoader.findClass()
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType): interface java.lang.reflect.GenericArrayType
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType)
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType): calling childClassLoader.findClass()
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType): interface java.lang.reflect.WildcardType
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport)
14:10:03.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport): calling childClassLoader.findClass()
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport)
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport): calling childClassLoader().findClass() 
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport): class com.google.gson.internal.sql.SqlTypesSupport
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport): class com.google.gson.internal.sql.SqlTypesSupport
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType)
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): calling childClassLoader.findClass()
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType)
14:10:03.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): calling childClassLoader().findClass() 
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1)
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1): calling childClassLoader.findClass()
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1)
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1): calling childClassLoader().findClass() 
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1): class com.google.gson.internal.sql.SqlTypesSupport$1
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1): class com.google.gson.internal.sql.SqlTypesSupport$1
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2)
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2): calling childClassLoader.findClass()
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2)
14:10:03.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2): calling childClassLoader().findClass() 
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2): class com.google.gson.internal.sql.SqlTypesSupport$2
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2): class com.google.gson.internal.sql.SqlTypesSupport$2
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Date)
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Date): calling childClassLoader.findClass()
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Date)
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Date): calling childClassLoader().findClass() 
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Date): calling componentClassLoader.findClass()
14:10:03.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Date): calling componentClassLoader.loadClass()
14:10:03.520 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Date): class java.sql.Date
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1)
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): calling childClassLoader.findClass()
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1)
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): calling childClassLoader().findClass() 
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter)
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): calling childClassLoader.findClass()
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter)
14:10:03.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): calling childClassLoader().findClass() 
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): class com.google.gson.internal.bind.DefaultDateTypeAdapter
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): class com.google.gson.internal.bind.DefaultDateTypeAdapter
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Date)
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Date): calling childClassLoader.findClass()
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Date): class java.util.Date
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Timestamp)
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Timestamp): calling childClassLoader.findClass()
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Timestamp)
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Timestamp): calling childClassLoader().findClass() 
14:10:03.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Timestamp): calling componentClassLoader.findClass()
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Timestamp): calling componentClassLoader.loadClass()
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Timestamp): class java.sql.Timestamp
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter)
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter): calling childClassLoader.findClass()
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter)
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter): calling childClassLoader().findClass() 
14:10:03.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter): class com.google.gson.internal.sql.SqlDateTypeAdapter
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter): class com.google.gson.internal.sql.SqlDateTypeAdapter
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1)
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1): calling childClassLoader.findClass()
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1)
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1): class com.google.gson.TypeAdapter$1
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1): class com.google.gson.TypeAdapter$1
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1)
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1)
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): class com.google.gson.internal.sql.SqlDateTypeAdapter$1
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): class com.google.gson.internal.sql.SqlDateTypeAdapter$1
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter)
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): calling childClassLoader.findClass()
14:10:03.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter)
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): calling childClassLoader().findClass() 
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): class com.google.gson.internal.sql.SqlTimeTypeAdapter
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): class com.google.gson.internal.sql.SqlTimeTypeAdapter
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Time)
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Time): calling childClassLoader.findClass()
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Time)
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Time): calling childClassLoader().findClass() 
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Time): calling componentClassLoader.findClass()
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Time): calling componentClassLoader.loadClass()
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Time): class java.sql.Time
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1)
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1)
14:10:03.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): class com.google.gson.internal.sql.SqlTimeTypeAdapter$1
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): class com.google.gson.internal.sql.SqlTimeTypeAdapter$1
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter)
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): calling childClassLoader.findClass()
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter)
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): calling childClassLoader().findClass() 
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): class com.google.gson.internal.sql.SqlTimestampTypeAdapter
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): class com.google.gson.internal.sql.SqlTimestampTypeAdapter
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1)
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1)
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): class com.google.gson.internal.sql.SqlTimestampTypeAdapter$1
14:10:03.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): class com.google.gson.internal.sql.SqlTimestampTypeAdapter$1
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Locale)
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Locale): calling childClassLoader.findClass()
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Locale): class java.util.Locale
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters)
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters): calling childClassLoader.findClass()
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters)
14:10:03.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters): calling childClassLoader().findClass() 
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters): class com.google.gson.internal.bind.TypeAdapters
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters): class com.google.gson.internal.bind.TypeAdapters
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3)
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3): calling childClassLoader.findClass()
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3)
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3): calling childClassLoader().findClass() 
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3): class com.google.gson.internal.bind.TypeAdapters$3
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3): class com.google.gson.internal.bind.TypeAdapters$3
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4)
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4): calling childClassLoader.findClass()
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4)
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4): calling childClassLoader().findClass() 
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4): class com.google.gson.internal.bind.TypeAdapters$4
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4): class com.google.gson.internal.bind.TypeAdapters$4
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5)
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5): calling childClassLoader.findClass()
14:10:03.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5): calling childClassLoader().findClass() 
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5): class com.google.gson.internal.bind.TypeAdapters$5
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5): class com.google.gson.internal.bind.TypeAdapters$5
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6): calling childClassLoader.findClass()
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6): calling childClassLoader().findClass() 
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6): class com.google.gson.internal.bind.TypeAdapters$6
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6): class com.google.gson.internal.bind.TypeAdapters$6
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7): calling childClassLoader.findClass()
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7): calling childClassLoader().findClass() 
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7): class com.google.gson.internal.bind.TypeAdapters$7
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7): class com.google.gson.internal.bind.TypeAdapters$7
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11)
14:10:03.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11): calling childClassLoader.findClass()
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11): calling childClassLoader().findClass() 
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11): class com.google.gson.internal.bind.TypeAdapters$11
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11): class com.google.gson.internal.bind.TypeAdapters$11
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12): calling childClassLoader.findClass()
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12): calling childClassLoader().findClass() 
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12): class com.google.gson.internal.bind.TypeAdapters$12
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12): class com.google.gson.internal.bind.TypeAdapters$12
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13): calling childClassLoader.findClass()
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13): calling childClassLoader().findClass() 
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13): class com.google.gson.internal.bind.TypeAdapters$13
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13): class com.google.gson.internal.bind.TypeAdapters$13
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14)
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14): calling childClassLoader.findClass()
14:10:03.530 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14)
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14): calling childClassLoader().findClass() 
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14): class com.google.gson.internal.bind.TypeAdapters$14
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14): class com.google.gson.internal.bind.TypeAdapters$14
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15)
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15): calling childClassLoader.findClass()
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15)
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15): calling childClassLoader().findClass() 
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15): class com.google.gson.internal.bind.TypeAdapters$15
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15): class com.google.gson.internal.bind.TypeAdapters$15
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16)
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16): calling childClassLoader.findClass()
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16)
14:10:03.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16): calling childClassLoader().findClass() 
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16): class com.google.gson.internal.bind.TypeAdapters$16
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16): class com.google.gson.internal.bind.TypeAdapters$16
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17)
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17): calling childClassLoader.findClass()
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17)
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17): calling childClassLoader().findClass() 
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17): class com.google.gson.internal.bind.TypeAdapters$17
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17): class com.google.gson.internal.bind.TypeAdapters$17
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18)
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18): calling childClassLoader.findClass()
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18)
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18): calling childClassLoader().findClass() 
14:10:03.532 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18): class com.google.gson.internal.bind.TypeAdapters$18
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18): class com.google.gson.internal.bind.TypeAdapters$18
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19): calling childClassLoader.findClass()
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19): calling childClassLoader().findClass() 
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19): class com.google.gson.internal.bind.TypeAdapters$19
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19): class com.google.gson.internal.bind.TypeAdapters$19
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20): calling childClassLoader.findClass()
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20): calling childClassLoader().findClass() 
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20): class com.google.gson.internal.bind.TypeAdapters$20
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20): class com.google.gson.internal.bind.TypeAdapters$20
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21): calling childClassLoader.findClass()
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21)
14:10:03.533 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21): calling childClassLoader().findClass() 
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21): class com.google.gson.internal.bind.TypeAdapters$21
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21): class com.google.gson.internal.bind.TypeAdapters$21
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22): calling childClassLoader.findClass()
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22): calling childClassLoader().findClass() 
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22): class com.google.gson.internal.bind.TypeAdapters$22
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22): class com.google.gson.internal.bind.TypeAdapters$22
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23): calling childClassLoader.findClass()
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23): calling childClassLoader().findClass() 
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23): class com.google.gson.internal.bind.TypeAdapters$23
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23): class com.google.gson.internal.bind.TypeAdapters$23
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24): calling childClassLoader.findClass()
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24)
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24): calling childClassLoader().findClass() 
14:10:03.534 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24): class com.google.gson.internal.bind.TypeAdapters$24
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24): class com.google.gson.internal.bind.TypeAdapters$24
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26)
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26): calling childClassLoader.findClass()
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26)
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26): calling childClassLoader().findClass() 
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26): class com.google.gson.internal.bind.TypeAdapters$26
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26): class com.google.gson.internal.bind.TypeAdapters$26
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27)
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27): calling childClassLoader.findClass()
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27)
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27): calling childClassLoader().findClass() 
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27): class com.google.gson.internal.bind.TypeAdapters$27
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27): class com.google.gson.internal.bind.TypeAdapters$27
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28)
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28): calling childClassLoader.findClass()
14:10:03.535 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28)
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28): calling childClassLoader().findClass() 
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28): class com.google.gson.internal.bind.TypeAdapters$28
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28): class com.google.gson.internal.bind.TypeAdapters$28
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1)
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1): calling childClassLoader.findClass()
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1)
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1): calling childClassLoader().findClass() 
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1): class com.google.gson.internal.bind.TypeAdapters$1
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1): class com.google.gson.internal.bind.TypeAdapters$1
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31)
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31): calling childClassLoader.findClass()
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31)
14:10:03.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31): calling childClassLoader().findClass() 
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31): class com.google.gson.internal.bind.TypeAdapters$31
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31): class com.google.gson.internal.bind.TypeAdapters$31
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2)
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2): calling childClassLoader.findClass()
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2)
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2): calling childClassLoader().findClass() 
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2): class com.google.gson.internal.bind.TypeAdapters$2
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2): class com.google.gson.internal.bind.TypeAdapters$2
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.BitSet)
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.BitSet): calling childClassLoader.findClass()
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.BitSet): class java.util.BitSet
14:10:03.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32)
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32): calling childClassLoader.findClass()
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32)
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32): calling childClassLoader().findClass() 
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32): class com.google.gson.internal.bind.TypeAdapters$32
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32): class com.google.gson.internal.bind.TypeAdapters$32
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8)
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8): calling childClassLoader.findClass()
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8)
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8): calling childClassLoader().findClass() 
14:10:03.538 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8): class com.google.gson.internal.bind.TypeAdapters$8
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8): class com.google.gson.internal.bind.TypeAdapters$8
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger): calling childClassLoader.findClass()
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger): class java.util.concurrent.atomic.AtomicInteger
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9): calling childClassLoader.findClass()
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9): calling childClassLoader().findClass() 
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9): class com.google.gson.internal.bind.TypeAdapters$9
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9): class com.google.gson.internal.bind.TypeAdapters$9
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10): calling childClassLoader.findClass()
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10): calling childClassLoader().findClass() 
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10): class com.google.gson.internal.bind.TypeAdapters$10
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10): class com.google.gson.internal.bind.TypeAdapters$10
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray)
14:10:03.539 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray): calling childClassLoader.findClass()
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray): class java.util.concurrent.atomic.AtomicIntegerArray
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.math.BigInteger)
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.math.BigInteger): calling childClassLoader.findClass()
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.math.BigInteger): class java.math.BigInteger
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.StringBuffer)
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.StringBuffer): calling childClassLoader.findClass()
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.StringBuffer): class java.lang.StringBuffer
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URI)
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URI): calling childClassLoader.findClass()
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URI): class java.net.URI
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34)
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34): calling childClassLoader.findClass()
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34)
14:10:03.540 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34): calling childClassLoader().findClass() 
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34): class com.google.gson.internal.bind.TypeAdapters$34
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34): class com.google.gson.internal.bind.TypeAdapters$34
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1)
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1): calling childClassLoader.findClass()
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1)
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1): calling childClassLoader().findClass() 
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1): class com.google.gson.internal.bind.TypeAdapters$34$1
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1): class com.google.gson.internal.bind.TypeAdapters$34$1
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.UUID)
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.UUID): calling childClassLoader.findClass()
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.UUID): class java.util.UUID
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25)
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25): calling childClassLoader.findClass()
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25)
14:10:03.541 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25): calling childClassLoader().findClass() 
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25): class com.google.gson.internal.bind.TypeAdapters$25
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25): class com.google.gson.internal.bind.TypeAdapters$25
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Currency)
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Currency): calling childClassLoader.findClass()
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Currency): class java.util.Currency
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Calendar)
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Calendar): calling childClassLoader.findClass()
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Calendar): class java.util.Calendar
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.GregorianCalendar)
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.GregorianCalendar): calling childClassLoader.findClass()
14:10:03.542 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.GregorianCalendar): class java.util.GregorianCalendar
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33)
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33): calling childClassLoader.findClass()
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33)
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33): calling childClassLoader().findClass() 
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33): class com.google.gson.internal.bind.TypeAdapters$33
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33): class com.google.gson.internal.bind.TypeAdapters$33
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonArray)
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonArray): calling childClassLoader.findClass()
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonArray)
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonArray): calling childClassLoader().findClass() 
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Iterable)
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Iterable): calling childClassLoader.findClass()
14:10:03.543 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Iterable): interface java.lang.Iterable
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonArray): class com.google.gson.JsonArray
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonArray): class com.google.gson.JsonArray
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonObject)
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonObject): calling childClassLoader.findClass()
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonObject)
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonObject): calling childClassLoader().findClass() 
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonObject): class com.google.gson.JsonObject
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonObject): class com.google.gson.JsonObject
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29)
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29): calling childClassLoader.findClass()
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29)
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29): calling childClassLoader().findClass() 
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29): class com.google.gson.internal.bind.TypeAdapters$29
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29): class com.google.gson.internal.bind.TypeAdapters$29
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter)
14:10:03.544 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): calling childClassLoader.findClass()
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter)
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): calling childClassLoader().findClass() 
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): class com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): class com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap)
14:10:03.545 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap): calling childClassLoader.findClass()
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap): class java.util.concurrent.ConcurrentHashMap
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor): calling childClassLoader.findClass()
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor): calling childClassLoader().findClass() 
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor): class com.google.gson.internal.ConstructorConstructor
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor): class com.google.gson.internal.ConstructorConstructor
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor): calling childClassLoader.findClass()
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor): calling childClassLoader().findClass() 
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor): interface com.google.gson.internal.ObjectConstructor
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor): interface com.google.gson.internal.ObjectConstructor
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter): calling childClassLoader.findClass()
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter)
14:10:03.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter): calling childClassLoader().findClass() 
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter): class com.google.gson.internal.bind.ObjectTypeAdapter
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter): class com.google.gson.internal.bind.ObjectTypeAdapter
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1)
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1)
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): class com.google.gson.internal.bind.ObjectTypeAdapter$1
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): class com.google.gson.internal.bind.ObjectTypeAdapter$1
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter)
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter): calling childClassLoader.findClass()
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter)
14:10:03.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter): calling childClassLoader().findClass() 
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter): class com.google.gson.internal.bind.NumberTypeAdapter
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter): class com.google.gson.internal.bind.NumberTypeAdapter
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1): class com.google.gson.internal.bind.NumberTypeAdapter$1
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1): class com.google.gson.internal.bind.NumberTypeAdapter$1
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong): calling childClassLoader.findClass()
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong): class java.util.concurrent.atomic.AtomicLong
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$4)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$4): calling childClassLoader.findClass()
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$4)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$4): calling childClassLoader().findClass() 
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$4): class com.google.gson.Gson$4
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$4): class com.google.gson.Gson$4
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray)
14:10:03.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray): calling childClassLoader.findClass()
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray): class java.util.concurrent.atomic.AtomicLongArray
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$5)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$5): calling childClassLoader.findClass()
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$5)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$5): calling childClassLoader().findClass() 
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$5): class com.google.gson.Gson$5
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$5): class com.google.gson.Gson$5
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter): calling childClassLoader.findClass()
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter): calling childClassLoader().findClass() 
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter): class com.google.gson.internal.bind.DateTypeAdapter
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter): class com.google.gson.internal.bind.DateTypeAdapter
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1)
14:10:03.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1): class com.google.gson.internal.bind.DateTypeAdapter$1
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1): class com.google.gson.internal.bind.DateTypeAdapter$1
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter)
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter): calling childClassLoader.findClass()
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter)
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter): calling childClassLoader().findClass() 
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter): class com.google.gson.internal.bind.ArrayTypeAdapter
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter): class com.google.gson.internal.bind.ArrayTypeAdapter
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper)
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): calling childClassLoader.findClass()
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper)
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): calling childClassLoader().findClass() 
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): class com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): class com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper
14:10:03.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): calling childClassLoader.findClass()
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): calling childClassLoader().findClass() 
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): class com.google.gson.internal.bind.ArrayTypeAdapter$1
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): class com.google.gson.internal.bind.ArrayTypeAdapter$1
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): calling childClassLoader.findClass()
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): calling childClassLoader().findClass() 
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): class com.google.gson.internal.bind.CollectionTypeAdapterFactory
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): class com.google.gson.internal.bind.CollectionTypeAdapterFactory
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory): calling childClassLoader.findClass()
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory)
14:10:03.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory): calling childClassLoader().findClass() 
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory): class com.google.gson.internal.bind.MapTypeAdapterFactory
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory): class com.google.gson.internal.bind.MapTypeAdapterFactory
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter)
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter)
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory)
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): calling childClassLoader.findClass()
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory)
14:10:03.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): calling childClassLoader().findClass() 
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): class com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): class com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter)
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter): calling childClassLoader.findClass()
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter)
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter): calling childClassLoader().findClass() 
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter): class com.google.gson.internal.bind.TreeTypeAdapter
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter): class com.google.gson.internal.bind.TreeTypeAdapter
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory)
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): calling childClassLoader.findClass()
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory)
14:10:03.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): calling childClassLoader().findClass() 
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): calling childClassLoader.findClass()
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): calling childClassLoader().findClass() 
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): calling childClassLoader.findClass()
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1)
14:10:03.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): calling childClassLoader().findClass() 
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1
14:10:03.555 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.init(http://127.25.254.212:44445, /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml)
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient)
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient): calling childClassLoader.findClass()
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient)
14:10:03.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient): calling childClassLoader().findClass() 
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient): class org.apache.ranger.plugin.util.RangerRESTClient
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient): class org.apache.ranger.plugin.util.RangerRESTClient
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException)
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): calling childClassLoader.findClass()
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException)
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException): calling childClassLoader().findClass() 
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException): calling componentClassLoader.findClass()
14:10:03.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): calling componentClassLoader.loadClass()
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): class com.sun.jersey.api.client.ClientHandlerException
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException)
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException): calling childClassLoader.findClass()
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException): class java.security.NoSuchAlgorithmException
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.KeyStoreException)
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.KeyStoreException): calling childClassLoader.findClass()
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.KeyStoreException): class java.security.KeyStoreException
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.KeyManagementException)
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.KeyManagementException): calling childClassLoader.findClass()
14:10:03.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.KeyManagementException): class java.security.KeyManagementException
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.cert.CertificateException)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.cert.CertificateException): calling childClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.cert.CertificateException): class java.security.cert.CertificateException
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileNotFoundException)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileNotFoundException): calling childClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileNotFoundException): class java.io.FileNotFoundException
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException): calling childClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException): class java.security.UnrecoverableKeyException
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileInputStream)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileInputStream): calling childClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileInputStream): class java.io.FileInputStream
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): calling childClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig)
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig): calling childClassLoader().findClass() 
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig): calling componentClassLoader.findClass()
14:10:03.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): calling componentClassLoader.loadClass()
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): interface com.sun.jersey.api.client.config.ClientConfig
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier)
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier): calling childClassLoader.findClass()
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier): interface javax.net.ssl.HostnameVerifier
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter)
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): calling childClassLoader.findClass()
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter)
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter): calling childClassLoader().findClass() 
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter): calling componentClassLoader.findClass()
14:10:03.561 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): calling componentClassLoader.loadClass()
14:10:03.562 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): class com.sun.jersey.api.client.filter.ClientFilter
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter)
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling childClassLoader.findClass()
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter)
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling childClassLoader().findClass() 
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling componentClassLoader.findClass()
14:10:03.563 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling componentClassLoader.loadClass()
14:10:03.564 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): class com.sun.jersey.api.client.filter.HTTPBasicAuthFilter
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory): calling childClassLoader.findClass()
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory): class javax.net.ssl.KeyManagerFactory
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory): calling childClassLoader.findClass()
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory): class javax.net.ssl.TrustManagerFactory
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Random)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Random): calling childClassLoader.findClass()
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Random): class java.util.Random
14:10:03.565 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.init(http://127.25.254.212:44445, /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil): calling childClassLoader.findClass()
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil): calling childClassLoader().findClass() 
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil): class org.apache.ranger.plugin.util.URLEncoderUtil
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil): class org.apache.ranger.plugin.util.URLEncoderUtil
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URLEncoder)
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URLEncoder): calling childClassLoader.findClass()
14:10:03.565 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URLEncoder): class java.net.URLEncoder
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== RangerBasePlugin.createAdminClient(kms, kms, ranger.plugin.kms): policySourceImpl=org.apache.ranger.admin.client.RangerAdminRESTClient, client=org.apache.ranger.admin.client.RangerAdminRESTClient@7dc2e4be
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider)
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider): calling childClassLoader.findClass()
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider)
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider): calling childClassLoader().findClass() 
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider): class org.apache.ranger.plugin.util.RangerRolesProvider
14:10:03.566 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider): class org.apache.ranger.plugin.util.RangerRolesProvider
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).RangerRolesProvider()
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms).RangerRolesProvider()
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).PolicyRefresher()
14:10:03.567 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Created PolicyRefresher Thread(PolicyRefresher(serviceName=kms)-24)
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadRoles()
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName= kms serviceType= kms).loadUserGroupRoles()
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory)
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): calling childClassLoader.findClass()
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory)
14:10:03.567 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): calling childClassLoader().findClass() 
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): class org.apache.ranger.plugin.util.RangerPerfTracerFactory
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): class org.apache.ranger.plugin.util.RangerPerfTracerFactory
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer)
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): calling childClassLoader.findClass()
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer)
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): calling childClassLoader().findClass() 
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): class org.apache.ranger.plugin.util.RangerPerfCollectorTracer
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): class org.apache.ranger.plugin.util.RangerPerfCollectorTracer
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory)
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory): calling childClassLoader.findClass()
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory): class java.lang.management.ManagementFactory
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean)
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean): calling childClassLoader.findClass()
14:10:03.568 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean): interface java.lang.management.ThreadMXBean
14:10:03.568 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeSupported (by JVM)  = true
14:10:03.568 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeEnabled  = true
14:10:03.568 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeEnabled  = true
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo): calling childClassLoader.findClass()
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo): class java.lang.management.ThreadInfo
14:10:03.569 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 105559704, Free memory:202721640
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).loadUserGroupRolesFromAdmin()
14:10:03.569 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.getRolesIfUpdated(-1, 0)
14:10:03.569 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- Checking Roles updated as user : rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory): calling childClassLoader.findClass()
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory): class java.lang.invoke.LambdaMetafactory
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction): calling childClassLoader.findClass()
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction): interface java.security.PrivilegedExceptionAction
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): calling childClassLoader.findClass()
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse)
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse): calling childClassLoader().findClass() 
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse): calling componentClassLoader.findClass()
14:10:03.569 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): calling componentClassLoader.loadClass()
14:10:03.571 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): class com.sun.jersey.api.client.ClientResponse
14:10:03.572 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)][action: org.apache.ranger.admin.client.RangerAdminRESTClient$$Lambda$157/0x00007f6bcc296ba8@150f6ba9]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.ranger.audit.provider.MiscUtil.executePrivilegedAction(MiscUtil.java:560)
	at org.apache.ranger.admin.client.RangerAdminRESTClient.getRolesIfUpdated(RangerAdminRESTClient.java:221)
	at org.apache.ranger.plugin.util.RangerRolesProvider.loadUserGroupRolesFromAdmin(RangerRolesProvider.java:172)
	at org.apache.ranger.plugin.util.RangerRolesProvider.loadUserGroupRoles(RangerRolesProvider.java:112)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadRoles(PolicyRefresher.java:563)
	at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:138)
	at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:310)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin.init(RangerKmsAuthorizer.java:346)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:303)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:127)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:153)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
	at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
	at java.base/java.lang.Class.newInstance(Class.java:647)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:70)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.getKeyAcls(KMSWebApp.java:254)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:143)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig)
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling childClassLoader.findClass()
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig)
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling childClassLoader().findClass() 
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling componentClassLoader.findClass()
14:10:03.576 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling componentClassLoader.loadClass()
14:10:03.577 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): class com.sun.jersey.api.client.config.DefaultClientConfig
14:10:03.577 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider)
14:10:03.577 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling childClassLoader.findClass()
14:10:03.577 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider)
14:10:03.577 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling childClassLoader().findClass() 
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling componentClassLoader.findClass()
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling componentClassLoader.loadClass()
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): class com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client)
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): calling childClassLoader.findClass()
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client)
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client): calling childClassLoader().findClass() 
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client): calling componentClassLoader.findClass()
14:10:03.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): calling componentClassLoader.loadClass()
14:10:03.580 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): class com.sun.jersey.api.client.Client
14:10:03.592 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:10:03.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/jersey-client-components): calling childClassLoader.findResources()
14:10:03.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): calling componentClassLoader.getResources()
14:10:03.594 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): java.lang.CompoundEnumeration@2ae01cc6
14:10:03.594 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:10:03.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:10:03.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling childClassLoader.findResources()
14:10:03.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling componentClassLoader.getResources()
14:10:03.600 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): java.lang.CompoundEnumeration@4b754316
14:10:03.600 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:10:03.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:10:03.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling childClassLoader.findResources()
14:10:03.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling componentClassLoader.getResources()
14:10:03.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): java.lang.CompoundEnumeration@313bd59
14:10:03.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider)
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling childClassLoader.findClass()
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider)
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling childClassLoader().findClass() 
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling componentClassLoader.findClass()
14:10:03.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling componentClassLoader.loadClass()
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): class com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider)
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling childClassLoader.findClass()
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider)
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling childClassLoader().findClass() 
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling componentClassLoader.findClass()
14:10:03.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling componentClassLoader.loadClass()
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): class com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider)
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling childClassLoader.findClass()
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider)
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling childClassLoader().findClass() 
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling componentClassLoader.findClass()
14:10:03.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling componentClassLoader.loadClass()
14:10:03.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): class com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider
14:10:03.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider)
14:10:03.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling childClassLoader.findClass()
14:10:03.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider)
14:10:03.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling childClassLoader().findClass() 
14:10:03.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling componentClassLoader.findClass()
14:10:03.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling componentClassLoader.loadClass()
14:10:03.613 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): class com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider
14:10:03.616 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct)
14:10:03.616 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): calling childClassLoader.findClass()
14:10:03.616 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.annotation.PostConstruct)
14:10:03.616 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PostConstruct): calling childClassLoader().findClass() 
14:10:03.617 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PostConstruct): calling componentClassLoader.findClass()
14:10:03.617 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): calling componentClassLoader.loadClass()
14:10:03.617 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): interface javax.annotation.PostConstruct
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy)
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): calling childClassLoader.findClass()
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.annotation.PreDestroy)
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PreDestroy): calling childClassLoader().findClass() 
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PreDestroy): calling componentClassLoader.findClass()
14:10:03.630 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): calling componentClassLoader.loadClass()
14:10:03.631 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): interface javax.annotation.PreDestroy
14:10:03.719 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate) 
14:10:03.719 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate): calling componentClassLoader.getResources()
14:10:03.720 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate): jar:file:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/lib/jersey-bundle-1.19.4.jar!/META-INF/services/javax.ws.rs.ext.RuntimeDelegate
14:10:03.720 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl)
14:10:03.720 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling childClassLoader.findClass()
14:10:03.720 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl)
14:10:03.720 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling childClassLoader().findClass() 
14:10:03.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling componentClassLoader.findClass()
14:10:03.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling componentClassLoader.loadClass()
14:10:03.723 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): class com.sun.jersey.server.impl.provider.RuntimeDelegateImpl
14:10:03.727 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider) 
14:10:03.727 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): calling childClassLoader.findResources()
14:10:03.727 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): calling componentClassLoader.getResources()
14:10:03.728 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): java.lang.CompoundEnumeration@374a9827
14:10:03.728 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider) 
14:10:03.729 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider)
14:10:03.730 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling childClassLoader.findClass()
14:10:03.730 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider)
14:10:03.730 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling childClassLoader().findClass() 
14:10:03.730 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling componentClassLoader.findClass()
14:10:03.730 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling componentClassLoader.loadClass()
14:10:03.731 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): class com.sun.jersey.core.impl.provider.header.LocaleProvider
14:10:03.731 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider)
14:10:03.732 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling childClassLoader.findClass()
14:10:03.732 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider)
14:10:03.732 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling childClassLoader().findClass() 
14:10:03.732 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling componentClassLoader.findClass()
14:10:03.732 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling componentClassLoader.loadClass()
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): class com.sun.jersey.core.impl.provider.header.EntityTagProvider
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider)
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling childClassLoader.findClass()
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider)
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling childClassLoader().findClass() 
14:10:03.733 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling componentClassLoader.findClass()
14:10:03.734 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling componentClassLoader.loadClass()
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): class com.sun.jersey.core.impl.provider.header.MediaTypeProvider
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider)
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling childClassLoader.findClass()
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider)
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling childClassLoader().findClass() 
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling componentClassLoader.findClass()
14:10:03.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling componentClassLoader.loadClass()
14:10:03.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): class com.sun.jersey.core.impl.provider.header.CacheControlProvider
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider)
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling childClassLoader.findClass()
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider)
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling childClassLoader().findClass() 
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling componentClassLoader.findClass()
14:10:03.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling componentClassLoader.loadClass()
14:10:03.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): class com.sun.jersey.core.impl.provider.header.NewCookieProvider
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider)
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling childClassLoader.findClass()
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider)
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling childClassLoader().findClass() 
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling componentClassLoader.findClass()
14:10:03.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling componentClassLoader.loadClass()
14:10:03.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): class com.sun.jersey.core.impl.provider.header.CookieProvider
14:10:03.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider)
14:10:03.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling childClassLoader.findClass()
14:10:03.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider)
14:10:03.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling childClassLoader().findClass() 
14:10:03.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling componentClassLoader.findClass()
14:10:03.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling componentClassLoader.loadClass()
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): class com.sun.jersey.core.impl.provider.header.URIProvider
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider)
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling childClassLoader.findClass()
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider)
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling childClassLoader().findClass() 
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling componentClassLoader.findClass()
14:10:03.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling componentClassLoader.loadClass()
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): class com.sun.jersey.core.impl.provider.header.DateProvider
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider)
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling childClassLoader.findClass()
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider)
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling childClassLoader().findClass() 
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling componentClassLoader.findClass()
14:10:03.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling componentClassLoader.loadClass()
14:10:03.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): class com.sun.jersey.core.impl.provider.header.StringProvider
14:10:03.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:10:03.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling childClassLoader.findResources()
14:10:03.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling componentClassLoader.getResources()
14:10:03.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): java.lang.CompoundEnumeration@69d4964c
14:10:03.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:10:03.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider)
14:10:03.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling childClassLoader.findClass()
14:10:03.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider)
14:10:03.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling childClassLoader().findClass() 
14:10:03.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling componentClassLoader.findClass()
14:10:03.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling componentClassLoader.loadClass()
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): class com.sun.jersey.core.impl.provider.entity.StringProvider
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider)
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling childClassLoader.findClass()
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider)
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling childClassLoader().findClass() 
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling componentClassLoader.findClass()
14:10:03.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling componentClassLoader.loadClass()
14:10:03.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): class com.sun.jersey.core.impl.provider.entity.ByteArrayProvider
14:10:03.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider)
14:10:03.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling childClassLoader.findClass()
14:10:03.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider)
14:10:03.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling childClassLoader().findClass() 
14:10:03.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling componentClassLoader.findClass()
14:10:03.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling componentClassLoader.loadClass()
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): class com.sun.jersey.core.impl.provider.entity.FileProvider
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider)
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling childClassLoader.findClass()
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider)
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling childClassLoader().findClass() 
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling componentClassLoader.findClass()
14:10:03.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling componentClassLoader.loadClass()
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): class com.sun.jersey.core.impl.provider.entity.InputStreamProvider
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider)
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling childClassLoader.findClass()
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider)
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling childClassLoader().findClass() 
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling componentClassLoader.findClass()
14:10:03.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling componentClassLoader.loadClass()
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): class com.sun.jersey.core.impl.provider.entity.DataSourceProvider
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider)
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling childClassLoader.findClass()
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider)
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling childClassLoader().findClass() 
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling componentClassLoader.findClass()
14:10:03.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling componentClassLoader.loadClass()
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): class com.sun.jersey.core.impl.provider.entity.RenderedImageProvider
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider)
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling childClassLoader.findClass()
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider)
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling childClassLoader().findClass() 
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling componentClassLoader.findClass()
14:10:03.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling componentClassLoader.loadClass()
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): class com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider)
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling childClassLoader.findClass()
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider)
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling childClassLoader().findClass() 
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling componentClassLoader.findClass()
14:10:03.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling componentClassLoader.loadClass()
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): class com.sun.jersey.core.impl.provider.entity.FormProvider
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider)
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling childClassLoader.findClass()
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider)
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling childClassLoader().findClass() 
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling componentClassLoader.findClass()
14:10:03.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling componentClassLoader.loadClass()
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): class com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App)
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling childClassLoader.findClass()
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App)
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling childClassLoader().findClass() 
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling componentClassLoader.findClass()
14:10:03.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text)
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling childClassLoader.findClass()
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text)
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling childClassLoader().findClass() 
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling componentClassLoader.findClass()
14:10:03.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling componentClassLoader.loadClass()
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General)
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling childClassLoader.findClass()
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General)
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling childClassLoader().findClass() 
14:10:03.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling componentClassLoader.findClass()
14:10:03.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App)
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling childClassLoader.findClass()
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App)
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling childClassLoader().findClass() 
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling componentClassLoader.findClass()
14:10:03.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text)
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling childClassLoader.findClass()
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text)
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling childClassLoader().findClass() 
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling componentClassLoader.findClass()
14:10:03.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling componentClassLoader.loadClass()
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General)
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling childClassLoader.findClass()
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General)
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling childClassLoader().findClass() 
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling componentClassLoader.findClass()
14:10:03.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.784 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App)
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling childClassLoader.findClass()
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App)
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling childClassLoader().findClass() 
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling componentClassLoader.findClass()
14:10:03.785 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text)
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling childClassLoader.findClass()
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text)
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling childClassLoader().findClass() 
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling componentClassLoader.findClass()
14:10:03.787 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling componentClassLoader.loadClass()
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General)
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling childClassLoader.findClass()
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General)
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling childClassLoader().findClass() 
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling componentClassLoader.findClass()
14:10:03.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider)
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling childClassLoader.findClass()
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider)
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling childClassLoader().findClass() 
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling componentClassLoader.findClass()
14:10:03.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling componentClassLoader.loadClass()
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): class com.sun.jersey.core.impl.provider.entity.ReaderProvider
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider)
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling childClassLoader.findClass()
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider)
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling childClassLoader().findClass() 
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling componentClassLoader.findClass()
14:10:03.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling componentClassLoader.loadClass()
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): class com.sun.jersey.core.impl.provider.entity.DocumentProvider
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader)
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling childClassLoader.findClass()
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader)
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling childClassLoader().findClass() 
14:10:03.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling componentClassLoader.findClass()
14:10:03.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling componentClassLoader.loadClass()
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader)
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling childClassLoader.findClass()
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader)
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling childClassLoader().findClass() 
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling componentClassLoader.findClass()
14:10:03.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling componentClassLoader.loadClass()
14:10:03.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader)
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling childClassLoader.findClass()
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader)
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling childClassLoader().findClass() 
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling componentClassLoader.findClass()
14:10:03.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling componentClassLoader.loadClass()
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App)
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling childClassLoader.findClass()
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App)
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling childClassLoader().findClass() 
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling componentClassLoader.findClass()
14:10:03.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling componentClassLoader.loadClass()
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text)
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling childClassLoader.findClass()
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text)
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling childClassLoader().findClass() 
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling componentClassLoader.findClass()
14:10:03.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling componentClassLoader.loadClass()
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General)
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling childClassLoader.findClass()
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General)
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling childClassLoader().findClass() 
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling componentClassLoader.findClass()
14:10:03.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling componentClassLoader.loadClass()
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader)
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling childClassLoader.findClass()
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader)
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling childClassLoader().findClass() 
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling componentClassLoader.findClass()
14:10:03.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling componentClassLoader.loadClass()
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): class com.sun.jersey.core.impl.provider.entity.EntityHolderReader
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider)
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling childClassLoader.findClass()
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider)
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling childClassLoader().findClass() 
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling componentClassLoader.findClass()
14:10:03.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling componentClassLoader.loadClass()
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): class com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider)
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling childClassLoader.findClass()
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider)
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling childClassLoader().findClass() 
14:10:03.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling componentClassLoader.findClass()
14:10:03.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling componentClassLoader.loadClass()
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): class com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl)
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling childClassLoader.findClass()
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl)
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling childClassLoader().findClass() 
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling componentClassLoader.findClass()
14:10:03.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling componentClassLoader.loadClass()
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App)
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling childClassLoader.findClass()
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App)
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling childClassLoader().findClass() 
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling componentClassLoader.findClass()
14:10:03.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General)
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling childClassLoader.findClass()
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General)
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling childClassLoader().findClass() 
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling componentClassLoader.findClass()
14:10:03.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App)
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling childClassLoader.findClass()
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App)
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling childClassLoader().findClass() 
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling componentClassLoader.findClass()
14:10:03.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General)
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling childClassLoader.findClass()
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General)
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling childClassLoader().findClass() 
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling componentClassLoader.findClass()
14:10:03.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App)
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling childClassLoader.findClass()
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App)
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling childClassLoader().findClass() 
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling componentClassLoader.findClass()
14:10:03.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling componentClassLoader.loadClass()
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General)
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling childClassLoader.findClass()
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General)
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling childClassLoader().findClass() 
14:10:03.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling componentClassLoader.findClass()
14:10:03.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling componentClassLoader.loadClass()
14:10:03.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General
14:10:03.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App)
14:10:03.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling childClassLoader.findClass()
14:10:03.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App)
14:10:03.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling childClassLoader().findClass() 
14:10:03.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling componentClassLoader.findClass()
14:10:03.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling componentClassLoader.loadClass()
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General)
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling childClassLoader.findClass()
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General)
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling childClassLoader().findClass() 
14:10:03.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling componentClassLoader.findClass()
14:10:03.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling componentClassLoader.loadClass()
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App)
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling childClassLoader.findClass()
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App)
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling childClassLoader().findClass() 
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling componentClassLoader.findClass()
14:10:03.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling componentClassLoader.loadClass()
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General)
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling childClassLoader.findClass()
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General)
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling childClassLoader().findClass() 
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling componentClassLoader.findClass()
14:10:03.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling componentClassLoader.loadClass()
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy)
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling childClassLoader.findClass()
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy)
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling childClassLoader().findClass() 
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling componentClassLoader.findClass()
14:10:03.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling componentClassLoader.loadClass()
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): class com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider)
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling childClassLoader.findClass()
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider)
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling childClassLoader().findClass() 
14:10:03.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling componentClassLoader.findClass()
14:10:03.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling componentClassLoader.loadClass()
14:10:03.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider)
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling childClassLoader.findClass()
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider)
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling childClassLoader().findClass() 
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling componentClassLoader.findClass()
14:10:03.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling componentClassLoader.loadClass()
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider)
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling childClassLoader.findClass()
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider)
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling childClassLoader().findClass() 
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling componentClassLoader.findClass()
14:10:03.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling componentClassLoader.loadClass()
14:10:03.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider
14:10:03.965 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:10:03.965 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling childClassLoader.findResources()
14:10:03.965 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling componentClassLoader.getResources()
14:10:03.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): java.lang.CompoundEnumeration@5c35bb40
14:10:03.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:10:03.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider)
14:10:03.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling childClassLoader.findClass()
14:10:03.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider)
14:10:03.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling childClassLoader().findClass() 
14:10:03.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling componentClassLoader.findClass()
14:10:03.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling componentClassLoader.loadClass()
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): class com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter)
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling childClassLoader.findClass()
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter)
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling childClassLoader().findClass() 
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling componentClassLoader.findClass()
14:10:03.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling componentClassLoader.loadClass()
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): class com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter)
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling childClassLoader.findClass()
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter)
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling childClassLoader().findClass() 
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling componentClassLoader.findClass()
14:10:03.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling componentClassLoader.loadClass()
14:10:03.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): class com.sun.jersey.server.impl.template.ViewableMessageBodyWriter
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider)
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling childClassLoader.findClass()
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider)
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling childClassLoader().findClass() 
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling componentClassLoader.findClass()
14:10:03.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling componentClassLoader.loadClass()
14:10:03.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): class com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Map$Entry)
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Map$Entry): calling childClassLoader.findClass()
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Map$Entry): interface java.util.Map$Entry
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource)
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): calling childClassLoader.findClass()
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource)
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource): calling childClassLoader().findClass() 
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource): calling componentClassLoader.findClass()
14:10:04.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): calling componentClassLoader.loadClass()
14:10:04.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): class com.sun.jersey.api.client.WebResource
14:10:04.023 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder)
14:10:04.023 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): calling childClassLoader.findClass()
14:10:04.023 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder)
14:10:04.023 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder): calling childClassLoader().findClass() 
14:10:04.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder): calling componentClassLoader.findClass()
14:10:04.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): calling componentClassLoader.loadClass()
14:10:04.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): class com.sun.jersey.api.client.WebResource$Builder
May 04 14:10:04 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903803, etypes {rep=17 tkt=17 ses=17}, rangerkms/127.25.254.212@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:10:04.348 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- checkAndResetSessionCookie(): status=200, sessionIdCookie=null, newCookie=RANGERADMINSESSIONID=8DC9DE69523BFF81AE77F6EDBB0B4462;Version=1;Path=/
14:10:04.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles)
14:10:04.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles): calling childClassLoader.findClass()
14:10:04.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles)
14:10:04.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles): calling childClassLoader().findClass() 
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles): class org.apache.ranger.plugin.util.RangerRoles
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles): class org.apache.ranger.plugin.util.RangerRoles
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2)
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2): calling childClassLoader.findClass()
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2)
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2): calling childClassLoader().findClass() 
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2): class org.apache.ranger.plugin.util.JsonUtilsV2
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2): class org.apache.ranger.plugin.util.JsonUtilsV2
14:10:04.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1)
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): calling childClassLoader.findClass()
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1)
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): calling childClassLoader().findClass() 
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): class org.apache.ranger.plugin.util.JsonUtilsV2$1
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): class org.apache.ranger.plugin.util.JsonUtilsV2$1
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2)
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): calling childClassLoader.findClass()
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2)
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): calling childClassLoader().findClass() 
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): class org.apache.ranger.plugin.util.JsonUtilsV2$2
14:10:04.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): class org.apache.ranger.plugin.util.JsonUtilsV2$2
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper)
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): calling childClassLoader.findClass()
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper)
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper): calling childClassLoader().findClass() 
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper): calling componentClassLoader.findClass()
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): calling componentClassLoader.loadClass()
14:10:04.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): class com.fasterxml.jackson.databind.ObjectMapper
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect)
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling childClassLoader.findClass()
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect)
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling childClassLoader().findClass() 
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling componentClassLoader.findClass()
14:10:04.516 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling componentClassLoader.loadClass()
14:10:04.517 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): interface com.fasterxml.jackson.annotation.JsonAutoDetect
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility)
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling childClassLoader.findClass()
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility)
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling childClassLoader().findClass() 
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling componentClassLoader.findClass()
14:10:04.518 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling componentClassLoader.loadClass()
14:10:04.519 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): class com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude)
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): calling childClassLoader.findClass()
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude)
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude): calling childClassLoader().findClass() 
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude): calling componentClassLoader.findClass()
14:10:04.521 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): calling componentClassLoader.loadClass()
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): interface com.fasterxml.jackson.annotation.JsonInclude
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include)
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling childClassLoader.findClass()
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include)
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling childClassLoader().findClass() 
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling componentClassLoader.findClass()
14:10:04.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling componentClassLoader.loadClass()
14:10:04.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): class com.fasterxml.jackson.annotation.JsonInclude$Include
14:10:04.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties)
14:10:04.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling childClassLoader.findClass()
14:10:04.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties)
14:10:04.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling childClassLoader().findClass() 
14:10:04.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling componentClassLoader.findClass()
14:10:04.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling componentClassLoader.loadClass()
14:10:04.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): interface com.fasterxml.jackson.annotation.JsonIgnoreProperties
14:10:04.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole)
14:10:04.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole): calling childClassLoader.findClass()
14:10:04.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole)
14:10:04.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole): calling childClassLoader().findClass() 
14:10:04.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole): class org.apache.ranger.plugin.model.RangerRole
14:10:04.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole): class org.apache.ranger.plugin.model.RangerRole
14:10:04.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember)
14:10:04.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): calling childClassLoader.findClass()
14:10:04.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember)
14:10:04.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): calling childClassLoader().findClass() 
14:10:04.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): class org.apache.ranger.plugin.model.RangerRole$RoleMember
14:10:04.586 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): class org.apache.ranger.plugin.model.RangerRole$RoleMember
14:10:04.603 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.getRolesIfUpdated(-1, 0): 
14:10:04.603 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).saveToCache()
14:10:04.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils)
14:10:04.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils): calling childClassLoader.findClass()
14:10:04.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils)
14:10:04.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils): calling childClassLoader().findClass() 
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils): class org.apache.ranger.authorization.utils.JsonUtils
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils): class org.apache.ranger.authorization.utils.JsonUtils
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1)
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1): calling childClassLoader.findClass()
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1)
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1): calling childClassLoader().findClass() 
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1): class org.apache.ranger.authorization.utils.JsonUtils$1
14:10:04.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1): class org.apache.ranger.authorization.utils.JsonUtils$1
[... identical RangerPluginClassLoader loadClass/findClass DEBUG entries repeated for org.apache.ranger.authorization.utils.JsonUtils$2 through JsonUtils$11 ...]
14:10:04.607 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule)
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule): calling childClassLoader.findClass()
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule)
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule): calling childClassLoader().findClass() 
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule): class org.apache.ranger.plugin.model.RangerValiditySchedule
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule): class org.apache.ranger.plugin.model.RangerValiditySchedule
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter)
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter): calling childClassLoader.findClass()
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter)
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter): calling childClassLoader().findClass() 
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter): class org.apache.ranger.plugin.model.AuditFilter
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter): class org.apache.ranger.plugin.model.AuditFilter
14:10:04.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): calling childClassLoader.findClass()
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): calling childClassLoader().findClass() 
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): class org.apache.ranger.plugin.model.RangerValidityRecurrence
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): class org.apache.ranger.plugin.model.RangerValidityRecurrence
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal): calling childClassLoader.findClass()
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal): calling childClassLoader().findClass() 
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal): class org.apache.ranger.plugin.model.RangerPrincipal
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal): class org.apache.ranger.plugin.model.RangerPrincipal
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): calling childClassLoader.findClass()
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo)
14:10:04.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): calling childClassLoader().findClass() 
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource)
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): calling childClassLoader.findClass()
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource)
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): calling childClassLoader().findClass() 
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag)
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag): calling childClassLoader.findClass()
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag)
14:10:04.610 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag): calling childClassLoader().findClass() 
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag): class org.apache.ranger.plugin.model.RangerTag
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag): class org.apache.ranger.plugin.model.RangerTag
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature)
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): calling childClassLoader.findClass()
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature)
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature): calling childClassLoader().findClass() 
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature): calling componentClassLoader.findClass()
14:10:04.611 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): calling componentClassLoader.loadClass()
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): class com.fasterxml.jackson.core.JsonParser$Feature
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature)
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): calling childClassLoader.findClass()
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature)
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature): calling childClassLoader().findClass() 
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature): calling componentClassLoader.findClass()
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): calling componentClassLoader.loadClass()
14:10:04.612 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): class com.fasterxml.jackson.databind.DeserializationFeature
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Math)
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Math): calling childClassLoader.findClass()
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Math): class java.lang.Math
14:10:04.636 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.saveToCache(serviceName=kms):32457775:32816448
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider.saveToCache(serviceName=kms)
14:10:04.636 [main] INFO org.apache.ranger.plugin.util.RangerRolesProvider -- RangerRolesProvider(serviceName=kms): found updated version. lastKnownRoleVersion=-1; newVersion=1
14:10:04.636 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.loadUserGroupRolesFromAdmin(serviceName=kms):796322065:1067198569
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms serviceType= kms ).loadUserGroupRolesFromAdmin()
14:10:04.636 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 165328536, Free memory:142952808
14:10:04.636 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.loadUserGroupRoles(serviceName=kms):796512082:1067383821
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms).loadUserGroupRoles()
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadRoles()
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadPolicy()
14:10:04.636 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 165328536, Free memory:142952808
14:10:04.636 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadPolicyfromPolicyAdmin()
14:10:04.636 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.getServicePoliciesIfUpdated(-1, 0)
14:10:04.637 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- Checking Service policy if updated as user : rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:10:04.637 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)][action: org.apache.ranger.admin.client.RangerAdminRESTClient$$Lambda$162/0x00007f6bcc3529c0@22ee0d07]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.ranger.audit.provider.MiscUtil.executePrivilegedAction(MiscUtil.java:560)
	at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:137)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:302)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:241)
	at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:139)
	at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:310)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin.init(RangerKmsAuthorizer.java:346)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:303)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:127)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:153)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
	at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
	at java.base/java.lang.Class.newInstance(Class.java:647)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:70)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.getKeyAcls(KMSWebApp.java:254)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:143)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/jersey-client-components): calling childClassLoader.findResources()
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): calling componentClassLoader.getResources()
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): java.lang.CompoundEnumeration@177baa95
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling childClassLoader.findResources()
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling componentClassLoader.getResources()
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): java.lang.CompoundEnumeration@1266199c
14:10:04.638 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:10:04.639 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:10:04.639 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling childClassLoader.findResources()
14:10:04.639 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling componentClassLoader.getResources()
14:10:04.639 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): java.lang.CompoundEnumeration@4821c21
14:10:04.639 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:10:04.646 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:10:04.646 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling childClassLoader.findResources()
14:10:04.646 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling componentClassLoader.getResources()
14:10:04.646 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): java.lang.CompoundEnumeration@307ed189
14:10:04.646 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:10:04.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:10:04.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling childClassLoader.findResources()
14:10:04.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling componentClassLoader.getResources()
14:10:04.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): java.lang.CompoundEnumeration@577d07b
14:10:04.721 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:10:04.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies)
14:10:04.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies): calling childClassLoader.findClass()
14:10:04.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies)
14:10:04.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies): calling childClassLoader().findClass() 
14:10:04.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies): class org.apache.ranger.plugin.util.ServicePolicies
14:10:04.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies): class org.apache.ranger.plugin.util.ServicePolicies
14:10:04.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef)
14:10:04.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef): calling childClassLoader.findClass()
14:10:04.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef)
14:10:04.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef): calling childClassLoader().findClass() 
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef): class org.apache.ranger.plugin.model.RangerServiceDef
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef): class org.apache.ranger.plugin.model.RangerServiceDef
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies)
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): calling childClassLoader.findClass()
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies)
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): calling childClassLoader().findClass() 
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): class org.apache.ranger.plugin.util.ServicePolicies$TagPolicies
14:10:04.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): class org.apache.ranger.plugin.util.ServicePolicies$TagPolicies
14:10:04.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl)
14:10:04.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): calling childClassLoader.findClass()
14:10:04.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl)
14:10:04.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): calling childClassLoader().findClass() 
14:10:04.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl
14:10:04.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl
14:10:04.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy)
14:10:04.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy): calling childClassLoader.findClass()
14:10:04.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy)
14:10:04.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy): calling childClassLoader().findClass() 
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy): class org.apache.ranger.plugin.model.RangerPolicy
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy): class org.apache.ranger.plugin.model.RangerPolicy
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo)
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): calling childClassLoader.findClass()
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo)
14:10:04.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): calling childClassLoader().findClass() 
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): class org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): class org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta)
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta): calling childClassLoader.findClass()
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta)
14:10:04.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta): calling childClassLoader().findClass() 
14:10:04.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta): class org.apache.ranger.plugin.model.RangerPolicyDelta
14:10:04.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta): class org.apache.ranger.plugin.model.RangerPolicyDelta
14:10:04.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef)
14:10:04.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): calling childClassLoader.findClass()
14:10:04.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef)
14:10:04.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): calling childClassLoader().findClass() 
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef)
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): calling childClassLoader.findClass()
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef)
14:10:04.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): calling childClassLoader().findClass() 
14:10:04.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef
14:10:04.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef
14:10:04.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef)
14:10:04.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): calling childClassLoader.findClass()
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef)
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): calling childClassLoader().findClass() 
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef)
14:10:04.855 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): calling childClassLoader.findClass()
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef)
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): calling childClassLoader().findClass() 
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef)
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): calling childClassLoader.findClass()
14:10:04.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef)
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): calling childClassLoader().findClass() 
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef)
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): calling childClassLoader.findClass()
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef)
14:10:04.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): calling childClassLoader().findClass() 
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef)
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): calling childClassLoader.findClass()
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef)
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): calling childClassLoader().findClass() 
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef
14:10:04.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef)
14:10:04.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): calling childClassLoader.findClass()
14:10:04.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef)
14:10:04.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): calling childClassLoader().findClass() 
14:10:04.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef
14:10:04.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef
14:10:04.876 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef)
14:10:04.876 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): calling childClassLoader.findClass()
14:10:04.876 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef)
14:10:04.876 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): calling childClassLoader().findClass() 
14:10:04.877 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef
14:10:04.877 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory)
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): calling childClassLoader.findClass()
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory)
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): calling childClassLoader().findClass() 
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory
14:10:04.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory
14:10:04.904 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef)
14:10:04.907 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): calling childClassLoader.findClass()
14:10:04.907 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef)
14:10:04.907 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): calling childClassLoader().findClass() 
14:10:04.907 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef
14:10:04.907 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem)
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): calling childClassLoader.findClass()
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem)
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): calling childClassLoader().findClass() 
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem)
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): calling childClassLoader.findClass()
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem)
14:10:04.911 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): calling childClassLoader().findClass() 
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem)
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): calling childClassLoader.findClass()
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem)
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): calling childClassLoader().findClass() 
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition)
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): calling childClassLoader.findClass()
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition)
14:10:04.912 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): calling childClassLoader().findClass() 
14:10:04.913 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition
14:10:04.913 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition
14:10:04.920 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess)
14:10:04.920 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): calling childClassLoader.findClass()
14:10:04.921 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess)
14:10:04.921 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): calling childClassLoader().findClass() 
14:10:04.921 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess
14:10:04.921 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess
14:10:04.925 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo)
14:10:04.925 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): calling childClassLoader.findClass()
14:10:04.925 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo)
14:10:04.925 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): calling childClassLoader().findClass() 
14:10:04.926 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo
14:10:04.926 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo
14:10:04.933 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule)
14:10:04.935 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): calling childClassLoader.findClass()
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule)
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): calling childClassLoader().findClass() 
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval)
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): calling childClassLoader.findClass()
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval)
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): calling childClassLoader().findClass() 
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): class org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval
14:10:04.936 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): class org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec)
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): calling childClassLoader.findClass()
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec)
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): calling childClassLoader().findClass() 
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec
14:10:04.939 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore)
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): calling childClassLoader.findClass()
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore)
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore): calling childClassLoader().findClass() 
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore): calling componentClassLoader.findClass()
14:10:04.948 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): calling componentClassLoader.loadClass()
14:10:04.949 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): interface com.fasterxml.jackson.annotation.JsonIgnore
14:10:04.952 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator)
14:10:04.953 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): calling childClassLoader.findClass()
14:10:04.953 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator)
14:10:04.953 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): calling childClassLoader().findClass() 
14:10:04.953 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): class org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator
14:10:04.953 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): class org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator
14:10:04.955 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.HashSet)
14:10:04.955 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.HashSet): calling childClassLoader.findClass()
14:10:04.955 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.HashSet): class java.util.HashSet
14:10:04.958 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.getServicePoliciesIfUpdated(-1, 0): serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} 
displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} 
label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } 
category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null
14:10:04.958 [main] INFO org.apache.ranger.plugin.util.PolicyRefresher -- PolicyRefresher(serviceName=kms): found updated version. lastKnownVersion=-1; newVersion=4
14:10:04.958 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.loadPolicyFromPolicyAdmin(serviceName=kms):216813797:321665629
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadPolicyfromPolicyAdmin()
14:10:04.958 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 188397208, Free memory:119884136
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- ==> setPolicies(serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null)
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils)
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils): calling childClassLoader.findClass()
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils)
14:10:04.958 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils): calling childClassLoader().findClass() 
14:10:04.959 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils): class org.apache.commons.collections.MapUtils
14:10:04.959 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils): class org.apache.commons.collections.MapUtils
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.SortedMap)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.SortedMap): calling childClassLoader.findClass()
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.SortedMap): interface java.util.SortedMap
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap): calling childClassLoader.findClass()
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap): calling childClassLoader().findClass() 
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap): calling childClassLoader.findClass()
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap): calling childClassLoader().findClass() 
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap): interface org.apache.commons.collections.IterableMap
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap): interface org.apache.commons.collections.IterableMap
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator): calling childClassLoader.findClass()
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator)
14:10:04.960 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator): calling childClassLoader().findClass() 
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator): class org.apache.commons.collections.map.AbstractMapDecorator
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator): class org.apache.commons.collections.map.AbstractMapDecorator
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap): class org.apache.commons.collections.map.UnmodifiableMap
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap): class org.apache.commons.collections.map.UnmodifiableMap
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator)
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator): calling childClassLoader.findClass()
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator)
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator): calling childClassLoader().findClass() 
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator): interface org.apache.commons.collections.MapIterator
14:10:04.961 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator): interface org.apache.commons.collections.MapIterator
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TreeMap)
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TreeMap): calling childClassLoader.findClass()
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TreeMap): class java.util.TreeMap
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap)
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap): calling childClassLoader.findClass()
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap)
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap): calling childClassLoader().findClass() 
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator)
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): calling childClassLoader.findClass()
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator)
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): calling childClassLoader().findClass() 
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): class org.apache.commons.collections.map.AbstractSortedMapDecorator
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): class org.apache.commons.collections.map.AbstractSortedMapDecorator
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap): class org.apache.commons.collections.map.UnmodifiableSortedMap
14:10:04.962 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap): class org.apache.commons.collections.map.UnmodifiableSortedMap
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil)
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): calling childClassLoader.findClass()
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil)
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): calling childClassLoader().findClass() 
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): class org.apache.ranger.plugin.util.RangerPolicyDeltaUtil
14:10:04.963 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): class org.apache.ranger.plugin.util.RangerPolicyDeltaUtil
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- ==> hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null]
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- <== hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null], ret:[false]
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- Creating engine from policies
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor)
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): calling childClassLoader.findClass()
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor)
14:10:04.964 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): calling childClassLoader().findClass() 
14:10:04.965 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor
14:10:04.965 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor
14:10:04.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine)
14:10:04.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine): calling childClassLoader.findClass()
14:10:04.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine)
14:10:04.966 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine): calling childClassLoader().findClass() 
14:10:04.967 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine): class org.apache.ranger.plugin.policyengine.PolicyEngine
14:10:04.967 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine): class org.apache.ranger.plugin.policyengine.PolicyEngine
14:10:04.967 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- ==> PolicyEngine(, serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null, org.apache.ranger.plugin.policyengine.RangerPluginContext@2f80313f)
14:10:04.967 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 189445784, Free memory:118835560
14:10:04.967 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil)
14:10:04.967 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil): calling childClassLoader.findClass()
14:10:04.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil)
14:10:04.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil): calling childClassLoader().findClass() 
14:10:04.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil): class org.apache.ranger.plugin.util.ServiceDefUtil
14:10:04.968 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil): class org.apache.ranger.plugin.util.ServiceDefUtil
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator)
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): calling childClassLoader.findClass()
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator)
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): calling childClassLoader().findClass() 
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator)
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): calling childClassLoader.findClass()
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator)
14:10:04.969 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): calling childClassLoader().findClass() 
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator)
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): calling childClassLoader.findClass()
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator)
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): calling childClassLoader().findClass() 
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): interface org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): interface org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher)
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): calling childClassLoader.findClass()
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher)
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): calling childClassLoader().findClass() 
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher)
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): calling childClassLoader.findClass()
14:10:04.970 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher)
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): calling childClassLoader().findClass() 
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher)
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): calling childClassLoader.findClass()
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher)
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): calling childClassLoader().findClass() 
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): interface org.apache.ranger.plugin.contextenricher.RangerContextEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): interface org.apache.ranger.plugin.contextenricher.RangerContextEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): class org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): class org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): class org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): class org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> cleanResourceMatchers()
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock)
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock): calling childClassLoader.findClass()
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock): class java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== cleanResourceMatchers()
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock)
14:10:04.971 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock): calling childClassLoader.findClass()
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock): calling childClassLoader().findClass() 
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock): class org.apache.ranger.plugin.util.RangerReadWriteLock
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock): class org.apache.ranger.plugin.util.RangerReadWriteLock
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock): calling childClassLoader.findClass()
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock): interface java.util.concurrent.locks.Lock
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): calling childClassLoader.findClass()
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): calling childClassLoader().findClass() 
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.AutoCloseable)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.AutoCloseable): calling childClassLoader.findClass()
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.AutoCloseable): interface java.lang.AutoCloseable
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): class org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): class org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): calling childClassLoader.findClass()
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher)
14:10:04.972 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): calling childClassLoader().findClass() 
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): class org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): class org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.buildZoneTrie()
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.buildZoneTrie()
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- ==> hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null]
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- <== hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null], ret:[false]
14:10:04.973 [main] INFO org.apache.ranger.plugin.policyengine.PolicyEngine -- Policy engine will not perform in place update while processing policies.
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext)
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext): calling childClassLoader.findClass()
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext)
14:10:04.973 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext): calling childClassLoader().findClass() 
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext): class org.apache.ranger.plugin.service.RangerAuthContext
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext): class org.apache.ranger.plugin.service.RangerAuthContext
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil)
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil): calling childClassLoader.findClass()
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil)
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil): calling childClassLoader().findClass() 
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil): class org.apache.ranger.plugin.util.RangerRolesUtil
14:10:04.974 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil): class org.apache.ranger.plugin.util.RangerRolesUtil
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil)
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): calling childClassLoader.findClass()
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil)
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): calling childClassLoader().findClass() 
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): class org.apache.ranger.plugin.util.RangerUserStoreUtil
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): class org.apache.ranger.plugin.util.RangerUserStoreUtil
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository)
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): calling childClassLoader.findClass()
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository)
14:10:04.975 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): calling childClassLoader().findClass() 
14:10:04.976 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository
14:10:04.976 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy)
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): calling childClassLoader.findClass()
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy)
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): calling childClassLoader().findClass() 
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator)
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): calling childClassLoader.findClass()
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator)
14:10:04.977 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): calling childClassLoader().findClass() 
14:10:04.978 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator)
14:10:04.978 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): calling childClassLoader.findClass()
14:10:04.978 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator)
14:10:04.978 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): calling childClassLoader().findClass() 
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator)
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): calling childClassLoader.findClass()
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator)
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): calling childClassLoader().findClass() 
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator)
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): calling childClassLoader.findClass()
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator)
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): calling childClassLoader().findClass() 
14:10:04.979 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator)
14:10:04.980 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): calling childClassLoader.findClass()
14:10:04.980 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator)
14:10:04.980 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): calling childClassLoader().findClass() 
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator)
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): calling childClassLoader.findClass()
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator)
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): calling childClassLoader().findClass() 
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): interface org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator
14:10:04.981 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): interface org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum)
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): calling childClassLoader.findClass()
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum)
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): calling childClassLoader().findClass() 
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- RangerPolicyRepository : building policy-repository for service[kms], and zone:[null] with auditMode[AUDIT_DEFAULT]
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper)
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): calling childClassLoader.findClass()
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper)
14:10:04.982 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): calling childClassLoader().findClass() 
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> RangerServiceDefHelper(). The RangerServiceDef: RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} 
resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean 
expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate)
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): calling childClassLoader.findClass()
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate)
14:10:04.983 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): calling childClassLoader().findClass() 
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph)
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): calling childClassLoader.findClass()
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph)
14:10:04.984 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): calling childClassLoader().findClass() 
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Objects)
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Objects): calling childClassLoader.findClass()
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Objects): class java.util.Objects
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: _nodes={keyname=[]}
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists)
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling childClassLoader.findClass()
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists)
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling childClassLoader().findClass() 
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling componentClassLoader.findClass()
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling componentClassLoader.loadClass()
14:10:04.985 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): class org.apache.hadoop.thirdparty.com.google.common.collect.Lists
14:10:04.989 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:10:04.989 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:10:04.989 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.function.Function)
14:10:04.989 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.function.Function): calling childClassLoader.findClass()
14:10:04.989 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.function.Function): interface java.util.function.Function
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.stream.Stream)
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.stream.Stream): calling childClassLoader.findClass()
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.stream.Stream): interface java.util.stream.Stream
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.stream.Collectors)
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.stream.Collectors): calling childClassLoader.findClass()
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.stream.Collectors): class java.util.stream.Collectors
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel)
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): calling childClassLoader.findClass()
14:10:04.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel)
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): calling childClassLoader().findClass() 
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Found [3] resource hierarchies for service [kms] update-date[Mon May 04 14:09:29 UTC 2026]: {0=[[RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]], 1=[], 2=[]}
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.buildPolicyEvaluator(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }},RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, PolicyEngineOptions: { evaluatorType: auto, evaluateDelegateAdminOnly: false, disableContextEnrichers: false, disableCustomConditions: false, disableTagPolicyEvaluation: false, disablePolicyRefresher: false, disableTagRetriever: false, disableUserStoreRetriever: false, enableTagEnricherWithLocalRefresher: false, enableUserStoreEnricherWithLocalRefresher: false, disableTrieLookupPrefilter: false, optimizeTrieForRetrieval: false, cacheAuditResult: false, disableRoleResolution: true, optimizeTrieForSpace: false, optimizeTagTrieForRetrieval: false, optimizeTagTrieForSpace: false, enableResourceMatcherReuse: true })
14:10:04.991 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicy(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:10:04.992 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicy(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}): false
14:10:04.993 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator)
14:10:04.993 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:04.993 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator)
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator)
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator)
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator
14:10:04.994 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator)
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator)
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator
14:10:04.995 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator)
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): calling childClassLoader.findClass()
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator)
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): calling childClassLoader().findClass() 
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator)
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): calling childClassLoader.findClass()
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator)
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): calling childClassLoader().findClass() 
14:10:04.996 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1)
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): calling childClassLoader.findClass()
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1)
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): calling childClassLoader().findClass() 
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1
14:10:04.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1
14:10:04.998 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.init()
14:10:04.998 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator)
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): calling childClassLoader.findClass()
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator)
14:10:04.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): calling childClassLoader().findClass() 
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator)
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): calling childClassLoader.findClass()
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator)
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): calling childClassLoader().findClass() 
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher)
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): calling childClassLoader.findClass()
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher)
14:10:05.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): calling childClassLoader().findClass() 
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): interface org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): interface org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher)
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): calling childClassLoader.findClass()
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher)
14:10:05.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): calling childClassLoader().findClass() 
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): class org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): class org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher)
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): calling childClassLoader.findClass()
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher)
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): calling childClassLoader().findClass() 
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): interface org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher
14:10:05.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): interface org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher
14:10:05.003 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:10:05.004 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> getResourceHierarchies(policyType=0, keys=keyname)
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== getResourceHierarchies(policyType=0, keys=keyname) : [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.createResourceMatcher(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} })
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> getResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} })
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock)
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock): calling childClassLoader.findClass()
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock): class java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== getResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }) : ret=null
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher)
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): calling childClassLoader.findClass()
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher)
14:10:05.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): calling childClassLoader().findClass() 
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher)
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): calling childClassLoader.findClass()
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher)
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): calling childClassLoader().findClass() 
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher
14:10:05.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): calling childClassLoader.findClass()
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): calling childClassLoader().findClass() 
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): calling childClassLoader.findClass()
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): calling childClassLoader.findClass()
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher)
14:10:05.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): calling childClassLoader().findClass() 
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): class org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): class org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): calling childClassLoader.findClass()
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): calling childClassLoader.findClass()
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): calling childClassLoader.findClass()
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): calling childClassLoader().findClass() 
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): calling childClassLoader.findClass()
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher)
14:10:05.008 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): calling childClassLoader().findClass() 
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): calling childClassLoader.findClass()
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): calling childClassLoader().findClass() 
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher)
14:10:05.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher)
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher
14:10:05.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- ==> RangerAbstractResourceMatcher.init()
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver)
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): calling childClassLoader.findClass()
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver)
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): calling childClassLoader().findClass() 
14:10:05.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): class org.apache.ranger.plugin.util.RangerRequestExprResolver
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): class org.apache.ranger.plugin.util.RangerRequestExprResolver
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.regex.Matcher)
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.regex.Matcher): calling childClassLoader.findClass()
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.regex.Matcher): class java.util.regex.Matcher
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.resourcematcher.ResourceMatcher -- ==> setDelimiters(value= , startDelimiter={, endDelimiter=}, escapeChar=\, prefix=
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer)
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer): calling childClassLoader.findClass()
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer)
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer): calling childClassLoader().findClass() 
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer): class org.apache.ranger.plugin.util.StringTokenReplacer
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer): class org.apache.ranger.plugin.util.StringTokenReplacer
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.resourcematcher.ResourceMatcher -- <== setDelimiters(value= , startDelimiter={, endDelimiter=}, escapeChar=\, prefix=
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator)
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): calling childClassLoader.findClass()
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator)
14:10:05.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): calling childClassLoader().findClass() 
14:10:05.013 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator
14:10:05.013 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator
14:10:05.013 [main] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- <== RangerAbstractResourceMatcher.init()
14:10:05.013 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> setResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }, matcher=RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }})
14:10:05.013 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== setResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }, matcher=RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }})
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.createResourceMatcher(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }): RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }}
14:10:05.014 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():10796706:10923997
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator)
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator)
14:10:05.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator)
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator)
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator
14:10:05.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator(policyId=3, policyItem=RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, serviceType=kms, conditionsDisabled=false)
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator)
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): calling childClassLoader.findClass()
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator)
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): calling childClassLoader().findClass() 
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder)
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): calling childClassLoader.findClass()
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder)
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): calling childClassLoader().findClass() 
14:10:05.016 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder
14:10:05.017 [main] DEBUG org.apache.ranger.perf.policyitem.init -- [PERF]:main:RangerPolicyItemEvaluator.getPolicyItemConditionEvaluators(policyId=3, policyItemIndex=1):27009:28022
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator(policyId=3, conditionsCount=0)
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator(policyId=3, policyItem=RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, serviceType=kms, conditionsDisabled=false)
14:10:05.017 [main] DEBUG org.apache.ranger.perf.policyitem.init -- [PERF]:main:RangerPolicyItemEvaluator.getPolicyItemConditionEvaluators(policyId=3, policyItemIndex=2):15615:16555
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator(policyId=3, conditionsCount=0)
14:10:05.017 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=3):11199:11627
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator)
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): calling childClassLoader.findClass()
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator)
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): calling childClassLoader().findClass() 
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator
14:10:05.017 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator
14:10:05.017 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=3, policyName=all):18531262:18902615
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.checkIfHasAllPerms()
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.checkIfHasAllPerms(), false
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.computeEvalOrder()
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames)
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): calling childClassLoader.findClass()
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames)
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): calling childClassLoader().findClass() 
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- <== RangerOptimizedPolicyEvaluator.computeEvalOrder(), policyName:all, priority:9930
14:10:05.018 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- <== RangerOptimizedPolicyEvaluator.init()
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.buildPolicyEvaluator(RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }},RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }): RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} 
allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={{RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }}} } }} }
14:10:05.019 [main] INFO org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- This policy engine contains 1 policy evaluators
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- policy evaluation order: 1 policies
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- policy evaluation order: #1 - policy id=3; name=all; evalOrder=9930
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- dataMask policy evaluation order: 0 policies
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- rowFilter policy evaluation order: 0 policies
14:10:05.019 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- audit policy evaluation order: 0 policies
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult)
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): calling childClassLoader.findClass()
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult)
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): calling childClassLoader().findClass() 
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): class org.apache.ranger.plugin.model.AuditFilter$AccessResult
14:10:05.020 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): class org.apache.ranger.plugin.model.AuditFilter$AccessResult
14:10:05.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator)
14:10:05.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): calling childClassLoader.findClass()
14:10:05.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator)
14:10:05.024 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): calling childClassLoader().findClass() 
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem)
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): calling childClassLoader.findClass()
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem)
14:10:05.025 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): calling childClassLoader().findClass() 
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyEvaluator(auditFilter={accessResult=DENIED, resources=null, accessTypes=null, actions=null, users=null, groups=null, roles=null, isAudited=true}, priority=2, matchAnyResource=true)
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.init(2)
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:10:05.026 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():23119:24230
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:10:05.026 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:05.027 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=2):18430:19630
14:10:05.027 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=2, policyName=null):795308:796360
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator)
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): calling childClassLoader.findClass()
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator)
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): calling childClassLoader().findClass() 
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator
14:10:05.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator(RangerAuditPolicyItem={RangerPolicyItem={accessTypes={} users={} groups={} roles={} conditions={} delegateAdmin={false} } accessResult={DENIED} actions={} accessTypes={} isAudited={true}}, matchAnyUser=true, matchAnyAction=true, hasResourceOwner=false)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.init(2)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyEvaluator(auditFilter={accessResult=null, resources=null, accessTypes=null, actions=null, users=[keyadmin], groups=null, roles=null, isAudited=false}, priority=1, matchAnyResource=true)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.init(1)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:10:05.028 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():21738:23858
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:10:05.028 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=1):14110:14946
14:10:05.028 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=1, policyName=null):595818:597064
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator(RangerAuditPolicyItem={RangerPolicyItem={accessTypes={} users={keyadmin } groups={} roles={} conditions={} delegateAdmin={false} } accessResult={null} actions={} accessTypes={} isAudited={false}}, matchAnyUser=false, matchAnyAction=true, hasResourceOwner=false)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.init(1)
14:10:05.028 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie)
14:10:05.029 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): calling childClassLoader.findClass()
14:10:05.029 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie)
14:10:05.029 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): calling childClassLoader().findClass() 
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): class org.apache.ranger.plugin.policyengine.RangerResourceTrie
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): class org.apache.ranger.plugin.policyengine.RangerResourceTrie
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler)
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): calling childClassLoader.findClass()
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler)
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): calling childClassLoader().findClass() 
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): interface org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler
14:10:05.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): interface org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=1, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=1, isMultiThreaded=false)
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode)
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): calling childClassLoader.findClass()
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode)
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): calling childClassLoader().findClass() 
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode
14:10:05.031 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode
14:10:05.032 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):1012999:1016083
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=1, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ 1|,|]
14:10:05.032 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):1360311:1362560
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData)
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): calling childClassLoader.findClass()
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData)
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): calling childClassLoader().findClass() 
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData
14:10:05.032 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=1, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:10:05.032 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false)
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):9361:10731
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):124565:125087
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false)
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):9762:11628
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):110620:127509
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=2, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=2, isMultiThreaded=false)
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):41728:44031
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=2, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):154603:156160
14:10:05.033 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=2, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> RangerServiceDefHelper(). The RangerServiceDef: RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} 
resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean 
expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: _nodes={keyname=[]}
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:10:05.033 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Found [3] resource hierarchies for service [kms] update-date[Mon May 04 14:09:29 UTC 2026]: {0=[[RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]], 1=[], 2=[]}
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- PolicyEngine : No tag-policy-repository for service kms
14:10:05.034 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerPolicyEngine.init(hashCode=6e91e82e):64632289:66483826
14:10:05.034 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 194688664, Free memory:113592680
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- <== PolicyEngine()
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig)
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): calling childClassLoader.findClass()
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig)
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): calling childClassLoader().findClass() 
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor)
14:10:05.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): calling childClassLoader.findClass()
14:10:05.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor)
14:10:05.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): calling childClassLoader().findClass() 
14:10:05.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): class org.apache.ranger.plugin.service.RangerDefaultRequestProcessor
14:10:05.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): class org.apache.ranger.plugin.service.RangerDefaultRequestProcessor
14:10:05.035 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Switching policy engine from [-1]
14:10:05.035 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Switched policy engine to [4]
14:10:05.035 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).saveToCache()
14:10:05.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileFilter)
14:10:05.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileFilter): calling childClassLoader.findClass()
14:10:05.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileFilter): interface java.io.FileFilter
14:10:05.071 [main] INFO org.apache.ranger.plugin.util.PolicyRefresher -- No files matching '.+json_*' found
14:10:05.071 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.saveToCache(serviceName=kms):35129614:35611870
14:10:05.071 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).saveToCache()
14:10:05.071 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- <== setPolicies(serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:09:47 UTC 2026, policies=[RangerPolicy={id={3} guid={dc09ffd7-0397-41c0-9ce9-8839b8f05e29} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={c70ba7aa-1c2e-454e-834b-030fe061a1e0} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:09:29 UTC 2026} updateTime={Mon May 04 14:09:29 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null)
14:10:05.071 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.loadPolicy(serviceName=kms):327934834:435173096
14:10:05.071 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadPolicy()
14:10:05.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Timer)
14:10:05.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Timer): calling childClassLoader.findClass()
14:10:05.074 [PolicyRefresher(serviceName=kms)-24] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).run()
14:10:05.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Timer): class java.util.Timer
14:10:05.078 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- Scheduled policyDownloadRefresher to download policies every 30000 milliseconds
14:10:05.078 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler)
14:10:05.078 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): calling childClassLoader.findClass()
14:10:05.078 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler)
14:10:05.078 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): calling childClassLoader().findClass() 
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): class org.apache.ranger.plugin.audit.RangerDefaultAuditHandler
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): class org.apache.ranger.plugin.audit.RangerDefaultAuditHandler
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase)
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase): calling childClassLoader.findClass()
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase)
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase): calling childClassLoader().findClass() 
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase): class org.apache.ranger.audit.model.AuditEventBase
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase): class org.apache.ranger.audit.model.AuditEventBase
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent)
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent): calling childClassLoader.findClass()
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent)
14:10:05.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent): calling childClassLoader().findClass() 
14:10:05.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent): class org.apache.ranger.audit.model.AuthzAuditEvent
14:10:05.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent): class org.apache.ranger.audit.model.AuthzAuditEvent
14:10:05.080 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.init()
14:10:05.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:05.081 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.init()
14:10:05.081 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.RangerKmsAuthorizer()
14:10:05.081 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.startReloader()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.Executors)
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.Executors): calling childClassLoader.findClass()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.Executors): class java.util.concurrent.Executors
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit)
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit): calling childClassLoader.findClass()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit): class java.util.concurrent.TimeUnit
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService)
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService): calling childClassLoader.findClass()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService): interface java.util.concurrent.ScheduledExecutorService
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:05.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:05.082 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.startReloader()
14:10:05.106 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSAudit -- No audit logger configured, using default.
14:10:05.108 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSAudit -- Initializing audit logger class org.apache.hadoop.crypto.key.kms.server.SimpleKMSAuditLogger
14:10:05.110 [main] INFO org.apache.ranger.kms.metrics.KMSMetricWrapper -- Creating KMSMetricWrapper with thread-safe value=false
14:10:05.113 [main] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- ===>> KMSMetricWrapper.init()
14:10:05.115 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- from system property: null
14:10:05.115 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- from environment variable: null
14:10:05.191 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Could not locate file hadoop-metrics2-kms.properties
org.apache.commons.configuration2.ex.ConfigurationException: Could not locate: FileLocator [basePath=null, encoding=null, fileName=hadoop-metrics2-kms.properties, fileSystem=null, locationStrategy=null, sourceURL=null, urlConnectionOptions=null]
	at org.apache.commons.configuration2.io.FileLocatorUtils.locateOrThrow(FileLocatorUtils.java:484)
	at org.apache.commons.configuration2.io.FileHandler.load(FileHandler.java:606)
	at org.apache.commons.configuration2.io.FileHandler.load(FileHandler.java:579)
	at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:118)
	at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:97)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:482)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
	at org.apache.ranger.metrics.RangerMetricsSystemWrapper.init(RangerMetricsSystemWrapper.java:56)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.init(KMSMetricWrapper.java:80)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.<init>(KMSMetricWrapper.java:53)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.getInstance(KMSMetricWrapper.java:61)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:168)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:10:05.192 [main] DEBUG org.apache.commons.configuration2.io.FileLocatorUtils -- Loading configuration from the context classpath (hadoop-metrics2.properties)
14:10:05.222 [main] INFO org.apache.hadoop.metrics2.impl.MetricsConfig -- Loaded properties from hadoop-metrics2.properties
14:10:05.225 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Properties: *.period = 30

14:10:05.225 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Metrics Config: 
14:10:05.227 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: period
14:10:05.228 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: periodMillis
14:10:05.231 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.metrics2.impl.MetricsSystemImpl.droppedPubAll with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Dropped updates by all sinks"})
14:10:05.232 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableStat org.apache.hadoop.metrics2.impl.MetricsSystemImpl.publishStat with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Publish", "Publishing stats"})
14:10:05.232 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableStat org.apache.hadoop.metrics2.impl.MetricsSystemImpl.snapshotStat with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Snapshot", "Snapshot stats"})
14:10:05.235 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:05.235 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:05.235 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:05.244 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:05.244 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=10
14:10:05.244 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:05.244 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of active metrics sources, name=NumActiveSources, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of all registered metrics sources, name=NumAllSources, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of active metrics sinks, name=NumActiveSinks, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of all registered metrics sinks, name=NumAllSinks, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Dropped updates by all sinks, name=DroppedPubAll, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for publishing stats, name=PublishNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for publishing stats, name=PublishAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for snapshot stats, name=SnapshotNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for snapshot stats, name=SnapshotAvgTime, type=java.lang.Double, read-only, descriptor={}]]
14:10:05.256 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:05.256 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=MetricsSystem,sub=Stats
14:10:05.256 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source MetricsSystem,sub=Stats registered.
14:10:05.256 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Scheduled Metric snapshot period at 30 second(s).
14:10:05.256 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- kms metrics system started
14:10:05.257 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:05.257 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:05.257 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=10
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for getGroups, name=GetGroupsNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for getGroups, name=GetGroupsAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of failed kerberos logins and latency (milliseconds), name=LoginFailureNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of failed kerberos logins and latency (milliseconds), name=LoginFailureAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of successful kerberos logins and latency (milliseconds), name=LoginSuccessNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of successful kerberos logins and latency (milliseconds), name=LoginSuccessAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Renewal failures since last successful login, name=RenewalFailures, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Renewal failures since startup, name=RenewalFailuresTotal, type=java.lang.Long, read-only, descriptor={}]]
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=UgiMetrics
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source UgiMetrics registered.
14:10:05.258 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source UgiMetrics
14:10:05.259 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=MetricsSystem,sub=Control
14:10:05.261 [main] INFO org.apache.ranger.server.tomcat.EmbeddedServer -- Selected Tomcat protocolHandler: "http-nio-34499"
14:10:05.261 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- KMSMetricSource, KMS metrics
14:10:05.261 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:05.261 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:05.261 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:05.263 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=KEY_CREATE_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=KEY_CREATE_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_DECRYPT_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_DECRYPT_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_GENERATE_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_GENERATE_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_REENCRYPT_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_REENCRYPT_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_KEYS_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=DELETE_KEY_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=DELETE_KEY_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=ROLL_NEW_VERSION_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=ROLL_NEW_VERSION_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=INVALIDATE_CACHE_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=INVALIDATE_CACHE_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_KEYNAMES_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_METADATA_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_METADATA_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_CURRENT_KEY_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_CURRENT_KEY_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSION_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSION_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSIONS_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSIONS_ELAPSED_TIME  , value=0 , type=GAUGE
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=UNAUTHENTICATED_CALLS_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=UNAUTHORIZED_CALLS_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=TOTAL_CALL_COUNT  , value=0 , type=COUNTER
14:10:05.264 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:05.264 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=37
14:10:05.264 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:05.265 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=KEY_CREATE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=KEY_CREATE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_DECRYPT_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_DECRYPT_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_GENERATE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_GENERATE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_REENCRYPT_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_REENCRYPT_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_KEYS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=DELETE_KEY_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=DELETE_KEY_ELAPSED_TIME, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=, name=ROLL_NEW_VERSION_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=ROLL_NEW_VERSION_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=INVALIDATE_CACHE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=INVALIDATE_CACHE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_KEYNAMES_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_METADATA_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_METADATA_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_CURRENT_KEY_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_CURRENT_KEY_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSION_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSION_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSIONS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSIONS_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHENTICATED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHORIZED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=TOTAL_CALL_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHENTICATED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHORIZED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}]]
14:10:05.265 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:05.265 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=KMSMetricSource
14:10:05.265 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source KMSMetricSource registered.
14:10:05.265 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source KMSMetricSource
14:10:05.266 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- RangerJVM, Ranger common metric source (RangerMetricsJvmSource)
14:10:05.266 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:05.266 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:05.266 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:05.269 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:05.269 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=13
14:10:05.269 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:05.270 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger current memory utilization, name=MemoryCurrent, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger max memory utilization, name=MemoryMax, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app total GCs, name=GcCountTotal, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app total GC time, name=GcTimeTotal, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app MAX GC time, name=GcTimeMax, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger busy threads, name=ThreadsBusy, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger blocked threads, name=ThreadsBlocked, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger waiting threads, name=ThreadsWaiting, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger remaining threads, name=ThreadsRemaining, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger Processors available, name=ProcessorsAvailable, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger System Load Average, name=SystemLoadAvg, type=java.lang.Float, read-only, descriptor={}]]
14:10:05.270 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:05.270 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=RangerJVM
14:10:05.270 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source RangerJVM registered.
14:10:05.270 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source RangerJVM
14:10:05.271 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- RangerContainer, Ranger web container metric source (RangerMetricsContainerSource)
14:10:05.271 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:05.271 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:05.271 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=11
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger max configured container connections, name=MaxConnectionsCount, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger active container connections, name=ActiveConnectionsCount, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger accept connections count, name=ConnectionAcceptCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger connection timeout, name=ConnectionTimeout, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger connection keepAlive timeout, name=KeepAliveTimeout, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container worker threads count, name=MaxWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container minimum spare worker threads count, name=MinSpareWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container active worker threads count, name=ActiveWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container total worker threads count, name=TotalWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}]]
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=RangerContainer
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source RangerContainer registered.
14:10:05.273 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source RangerContainer
14:10:05.276 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Prometheus, Ranger common metric sink (RangerMetricsPrometheusSink)
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.context
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: context
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.context
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.period
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: period
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.period
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.queue.capacity
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: queue.capacity
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.queue.capacity
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.delay
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.delay
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.delay
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.backoff
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.backoff
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.backoff
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.count
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.count
14:10:05.278 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.count
14:10:05.280 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSinkAdapter -- Sink Prometheus started
14:10:05.280 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered sink Prometheus
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Json, Ranger common metric sink (RangerMetricsJsonSink)
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.context
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: context
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.context
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.period
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: period
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.period
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.queue.capacity
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: queue.capacity
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.queue.capacity
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.delay
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.delay
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.delay
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.backoff
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.backoff
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.backoff
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.count
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.count
14:10:05.281 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.count
14:10:05.286 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSinkAdapter -- Sink Json started
14:10:05.286 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered sink Json
14:10:05.286 [main] INFO org.apache.ranger.metrics.RangerMetricsSystemWrapper -- ===>> Ranger Metric system initialized successfully.
14:10:05.286 [main] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- <<=== KMSMetricWrapper.init()
14:10:05.286 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- ------------------ Ranger KMSWebApp---------------------
14:10:05.286 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- provider string = dbks://http@localhost:9292/kms
14:10:05.286 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- URI = dbks://http@localhost:9292/kms scheme = dbks
14:10:05.286 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- kmsconf size= 344 kms classname=org.apache.hadoop.conf.Configuration
14:10:05.286 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- ----------------Instantiating key provider ---------------
14:10:05.299 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> createProvider(dbks://http@localhost:9292/kms)
14:10:05.299 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> RangerKeyStoreProvider(conf)
14:10:05.299 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getConfiguration()
14:10:05.300 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getConfiguration()
14:10:05.300 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:10:05.330 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.349 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Acquiring creator semaphore for file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.349 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Acquiring creator semaphore for file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks: duration 0:00.001s
14:10:05.352 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Creating FS file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.352 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Loading filesystems
14:10:05.376 [main] DEBUG org.apache.hadoop.fs.FileSystem -- file:// = class org.apache.hadoop.fs.LocalFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:10:05.383 [main] DEBUG org.apache.hadoop.fs.FileSystem -- viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:10:05.386 [main] DEBUG org.apache.hadoop.fs.FileSystem -- har:// = class org.apache.hadoop.fs.HarFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:10:05.389 [main] DEBUG org.apache.hadoop.fs.FileSystem -- http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:10:05.390 [main] DEBUG org.apache.hadoop.fs.FileSystem -- https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:10:05.391 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking for FS supporting file
14:10:05.391 [main] DEBUG org.apache.hadoop.fs.FileSystem -- looking for configuration option fs.file.impl
14:10:05.391 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking in service filesystems for implementation class
14:10:05.391 [main] DEBUG org.apache.hadoop.fs.FileSystem -- FS for file is class org.apache.hadoop.fs.LocalFileSystem
14:10:05.397 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Creating FS file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks: duration 0:00.045s
14:10:05.402 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:10:05.402 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:10:05.402 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:10:05.405 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.406 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:10:05.406 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:10:05.406 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:10:05.409 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.409 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:10:05.409 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:10:05.409 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:10:05.413 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:10:05.413 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:10:05.413 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:10:05.413 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:10:05.413 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:10:07.022 [main] INFO org.apache.hadoop.crypto.key.RangerKMSDB -- Connected to DB : false
14:10:07.022 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Ranger KMS Database is enabled for storing master key.
14:10:07.028 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> generateAndGetMasterKey()
14:10:07.028 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.generateMasterKey()
14:10:07.029 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Generating Master Key...
14:10:07.029 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.init()
14:10:07.037 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.init()
14:10:07.137 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Master Key doesn't exist in DB, Generating the Master Key
14:10:07.137 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.encryptMasterKey()
14:10:07.137 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.generateMasterKey()
14:10:07.137 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPBEParameterSpec()
14:10:07.138 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.encryptKey()
14:10:07.138 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPasswordKey()
14:10:07.138 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getPasswordKey()
14:10:07.152 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.encryptKey()
14:10:07.152 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.encryptMasterKey()
14:10:07.159 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.saveEncryptedMK()
14:10:07.188 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.saveEncryptedMK()
14:10:07.188 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- Master Key Created with id = 1
14:10:07.188 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.generateMasterKey()
14:10:07.188 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getMasterKey()
14:10:07.188 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Getting Master Key
14:10:07.188 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getEncryptedMK()
14:10:07.206 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getEncryptedMK()
14:10:07.208 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getMasterKey()
14:10:07.208 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.decryptMasterKey()
14:10:07.208 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- Decrypting Master Key...
14:10:07.208 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPBEParameterSpec()
14:10:07.209 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPasswordKey()
14:10:07.209 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getPasswordKey()
14:10:07.211 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.decryptMasterKey()
14:10:07.211 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== generateAndGetMasterKey()
14:10:07.211 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:10:07.212 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:10:07.212 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:10:07.212 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:10:07.215 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=0
14:10:07.215 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- RangerKeyStore might be null or key is not present in the database.
14:10:07.215 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:10:07.215 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:10:07.215 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== createProvider(dbks://http@localhost:9292/kms)
14:10:07.215 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- keyProvider = org.apache.hadoop.crypto.key.RangerKeyStoreProvider@6fe6c99a
14:10:07.220 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Initialized KeyProvider CachingKeyProvider: org.apache.hadoop.crypto.key.RangerKeyStoreProvider@6fe6c99a
14:10:07.228 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Initialized KeyProviderCryptoExtension org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider:EagerKeyGeneratorKeyProviderCryptoExtension: KeyProviderCryptoExtension: CachingKeyProvider: org.apache.hadoop.crypto.key.RangerKeyStoreProvider@6fe6c99a
14:10:07.228 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Default key bitlength is 128
14:10:07.228 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Ranger KMS Started
14:10:07.284 [main] DEBUG org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics -- Initialized MetricsRegistry{info=MetricsInfoImpl{name=DelegationTokenSecretManagerMetrics, description=DelegationTokenSecretManagerMetrics}, tags=[], metrics=[]}
14:10:07.285 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.removeToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of removal of delegation tokens and latency (milliseconds)"})
14:10:07.285 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.storeToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of storage of delegation tokens and latency (milliseconds)"})
14:10:07.286 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.tokenFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Counter of delegation tokens operation failures"})
14:10:07.286 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.updateToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of update of delegation tokens and latency (milliseconds)"})
14:10:07.287 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- DelegationTokenSecretManagerMetrics, Delegation token secret manager metrics
14:10:07.287 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:10:07.287 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:10:07.287 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:10:07.287 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=9
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of removal of delegation tokens and latency (milliseconds), name=RemoveTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of removal of delegation tokens and latency (milliseconds), name=RemoveTokenAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of storage of delegation tokens and latency (milliseconds), name=StoreTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of storage of delegation tokens and latency (milliseconds), name=StoreTokenAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Counter of delegation tokens operation failures, name=TokenFailure, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of update of delegation tokens and latency (milliseconds), name=UpdateTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of update of delegation tokens and latency (milliseconds), name=UpdateTokenAvgTime, type=java.lang.Double, read-only, descriptor={}]]
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=DelegationTokenSecretManagerMetrics
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source DelegationTokenSecretManagerMetrics registered.
14:10:07.288 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source DelegationTokenSecretManagerMetrics
14:10:07.291 [main] INFO org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler -- Using keytab /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab, for principal HTTP/127.25.254.212@KRBTEST.COM
14:10:07.297 [main] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Updating the current master key for generating delegation tokens
14:10:07.300 [Thread[Thread-14,5,main]] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Starting expired delegation token remover thread, tokenRemoverScanInterval=60 min(s)
14:10:07.300 [Thread[Thread-14,5,main]] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Updating the current master key for generating delegation tokens
14:10:07.318 [main] INFO com.sun.jersey.api.core.PackagesResourceConfig -- Scanning for root resource and provider classes in the packages:
  org.apache.hadoop.crypto.key.kms.server
14:10:07.334 [main] INFO com.sun.jersey.api.core.ScanningResourceConfig -- Root resource classes found:
  class org.apache.hadoop.crypto.key.kms.server.MetricREST
  class org.apache.hadoop.crypto.key.kms.server.RangerKMSRestApi
  class org.apache.hadoop.crypto.key.kms.server.KMS
14:10:07.334 [main] INFO com.sun.jersey.api.core.ScanningResourceConfig -- Provider classes found:
  class org.apache.hadoop.crypto.key.kms.server.KMSExceptionsProvider
  class org.apache.hadoop.crypto.key.kms.server.KMSJSONReader
  class org.apache.hadoop.crypto.key.kms.server.KMSJSONWriter
14:10:07.398 [main] INFO com.sun.jersey.server.impl.application.WebApplicationImpl -- Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:46 PM'
14:10:07.628 [main] ERROR com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl -- Error while searching for service [javax.xml.bind.JAXBContextFactory]
javax.xml.bind.JAXBException: Error while searching for service [javax.xml.bind.JAXBContextFactory]
	at javax.xml.bind.ContextFinder$1.createException(ContextFinder.java:118)
	at javax.xml.bind.ContextFinder$1.createException(ContextFinder.java:115)
	at javax.xml.bind.ServiceLoaderUtil.firstByServiceLoader(ServiceLoaderUtil.java:76)
	at javax.xml.bind.ContextFinder.find(ContextFinder.java:343)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:508)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:465)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:366)
	at com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl.<init>(WadlApplicationContextImpl.java:107)
	at com.sun.jersey.server.impl.wadl.WadlFactory.init(WadlFactory.java:100)
	at com.sun.jersey.server.impl.application.RootResourceUriRules.initWadl(RootResourceUriRules.java:169)
	at com.sun.jersey.server.impl.application.RootResourceUriRules.<init>(RootResourceUriRules.java:106)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._initiate(WebApplicationImpl.java:1359)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.access$700(WebApplicationImpl.java:180)
	at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:799)
	at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:795)
	at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:795)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:790)
	at com.sun.jersey.spi.container.servlet.ServletContainer.initiate(ServletContainer.java:509)
	at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.initiate(ServletContainer.java:339)
	at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:605)
	at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:207)
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394)
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577)
	at javax.servlet.GenericServlet.init(GenericServlet.java:143)
	at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:984)
	at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:941)
	at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:838)
	at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4193)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4494)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
Caused by: java.util.ServiceConfigurationError: javax.xml.bind.JAXBContextFactory: com.sun.xml.bind.v2.JAXBContextFactory not a subtype
	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:593)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1244)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1273)
	at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1309)
	at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1393)
	at javax.xml.bind.ServiceLoaderUtil.firstByServiceLoader(ServiceLoaderUtil.java:69)
	... 52 common frames omitted
14:10:07.709 [main] INFO org.apache.coyote.http11.Http11NioProtocol -- Starting ProtocolHandler ["http-nio-34499"]
14:10:07.800 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/keys
14:10:07.868 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keys] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:07.868 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@708dc2eb)
14:10:07.896 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keys] user [keyadmin] authenticated
14:10:07.928 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> createKey()
14:10:07.932 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(CREATE, keyadmin@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): calling childClassLoader.findClass()
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): calling childClassLoader().findClass() 
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): calling childClassLoader.findClass()
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl)
14:10:07.933 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): calling childClassLoader().findClass() 
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): class org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): class org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): class org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): class org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope)
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): calling childClassLoader.findClass()
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope)
14:10:07.934 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): calling childClassLoader().findClass() 
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil)
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): calling childClassLoader.findClass()
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil)
14:10:07.935 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): calling childClassLoader().findClass() 
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): class org.apache.ranger.plugin.util.RangerAccessRequestUtil
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): class org.apache.ranger.plugin.util.RangerAccessRequestUtil
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource)
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): calling childClassLoader.findClass()
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource)
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): calling childClassLoader().findClass() 
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl)
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): calling childClassLoader.findClass()
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl)
14:10:07.936 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): calling childClassLoader().findClass() 
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource)
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): calling childClassLoader.findClass()
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource)
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): calling childClassLoader().findClass() 
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): interface org.apache.ranger.plugin.policyengine.RangerMutableResource
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): interface org.apache.ranger.plugin.policyengine.RangerMutableResource
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): class org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): class org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): class org.apache.ranger.authorization.kms.authorizer.RangerKMSResource
14:10:07.937 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): class org.apache.ranger.authorization.kms.authorizer.RangerKMSResource
14:10:07.959 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.Groups -- GroupCacheLoader - load.
14:10:07.967 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user keyadmin
java.io.IOException: No groups found for user keyadmin
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.access$400(Groups.java:75)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:334)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:270)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:228)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:121)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets)
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling childClassLoader.findClass()
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets)
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling childClassLoader().findClass() 
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling componentClassLoader.findClass()
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling componentClassLoader.loadClass()
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): class org.apache.hadoop.thirdparty.com.google.common.collect.Sets
14:10:07.968 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:07.969 [http-nio-34499-exec-1] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(746aa800_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult)
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): calling childClassLoader.findClass()
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult)
14:10:07.969 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): calling childClassLoader().findClass() 
14:10:07.970 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): class org.apache.ranger.plugin.policyengine.RangerAccessResult
14:10:07.970 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): class org.apache.ranger.plugin.policyengine.RangerAccessResult
14:10:07.970 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1)
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): calling childClassLoader.findClass()
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1)
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): calling childClassLoader().findClass() 
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError)
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError): calling childClassLoader.findClass()
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError): class java.lang.NoSuchFieldError
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever)
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): calling childClassLoader.findClass()
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever)
14:10:07.971 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): calling childClassLoader().findClass() 
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): class org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): class org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector)
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): calling childClassLoader.findClass()
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector)
14:10:07.972 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): calling childClassLoader().findClass() 
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope)
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): calling childClassLoader.findClass()
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope)
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): calling childClassLoader().findClass() 
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):443799:446490
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@8e54b84
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):1859571:1871162
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:07 UTC 2026)
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:07 UTC 2026) : true
14:10:07.973 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest)
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): calling childClassLoader.findClass()
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest)
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): calling childClassLoader().findClass() 
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): class org.apache.ranger.plugin.policyengine.RangerTagAccessRequest
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): class org.apache.ranger.plugin.policyengine.RangerTagAccessRequest
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType)
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): calling childClassLoader.findClass()
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType)
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): calling childClassLoader().findClass() 
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): class org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType
14:10:07.974 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): class org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType)
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): calling childClassLoader.findClass()
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType)
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): calling childClassLoader().findClass() 
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1)
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): calling childClassLoader.findClass()
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1)
14:10:07.975 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): calling childClassLoader().findClass() 
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-1:RangerDefaultPolicyResourceMatcher.getMatchType():1285655:1284599
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[create]
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper)
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): calling childClassLoader.findClass()
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper)
14:10:07.976 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): calling childClassLoader().findClass() 
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): class org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): class org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708): null
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null): true
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708): true
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708): true
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):303743:304106
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708): true
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3a796708): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@71a62686
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{create=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:07.977 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEvaluator.evaluate(requestHashCode=746aa800,policyId=3, policyName=all):3952974:3952762
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{create=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEngine.evaluatePolicies(requestHashCode=746aa800_0):9233740:9246291
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):18323:18981
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@13fec0da
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):202537:203249
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1)
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): calling childClassLoader.findClass()
14:10:07.978 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1)
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): calling childClassLoader().findClass() 
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAction(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.979 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:07 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): null
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(null)
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(null)
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(CREATE, keyadmin@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:10:07.980 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- Creating key: name=kuduclusterkey, cipher=AES/CTR/NoPadding, keyLength=128, description=kuduclusterkey
14:10:07.981 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: keyadmin@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$229/0x00007f6bcc63fa80@42ceeea3]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:149)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:07.983 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:08.002 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> createKey(kuduclusterkey)
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:10:08.003 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:10:08.005 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=0
14:10:08.005 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- RangerKeyStore might be null or key is not present in the database.
14:10:08.005 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:10:08.005 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:10:08.005 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=false
14:10:08.006 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> innerSetKeyVersion(name=kuduclusterkey, versionName=kuduclusterkey@0)
14:10:08.007 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> addKeyEntry(kuduclusterkey)
14:10:08.008 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> sealKey()
14:10:08.020 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== sealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStore$RangerSealedObject@414a1a2d
14:10:08.020 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== addKeyEntry(kuduclusterkey)
14:10:08.021 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> addKeyEntry(kuduclusterkey@0)
14:10:08.021 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> sealKey()
14:10:08.024 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== sealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStore$RangerSealedObject@4d4d8432
14:10:08.024 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== addKeyEntry(kuduclusterkey@0)
14:10:08.025 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== innerSetKeyVersion(name=kuduclusterkey, versionName=kuduclusterkey@0): ret=key(kuduclusterkey@0)= 80 b3 63 8c 80 4f e2 d8 1f 63 a6 04 73 be ff 48
14:10:08.025 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== createKey(kuduclusterkey)
14:10:08.025 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> flush()
14:10:08.025 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineStore()
14:10:08.035 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationStore(kuduclusterkey)
14:10:08.058 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationStore(kuduclusterkey)
14:10:08.059 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationStore(kuduclusterkey@0)
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationStore(kuduclusterkey@0)
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineStore()
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:10:08.065 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:10:08.067 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=2
14:10:08.072 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded key kuduclusterkey
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded key kuduclusterkey@0
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded 2 keys
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): keyEntries switched with 2 keys
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:10:08.073 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== flush()
14:10:08.075 [http-nio-34499-exec-1] INFO kms-audit -- OK[op=CREATE_KEY, key=kuduclusterkey, user=keyadmin@KRBTEST.COM] UserProvidedMaterial:false Description:kuduclusterkey
14:10:08.075 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:10:08.075 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.075 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.075 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:10:08.076 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user keyadmin
java.io.IOException: No groups found for user keyadmin
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:221)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:143)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:165)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:08.077 [http-nio-34499-exec-1] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(6f874dcf_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=; } })
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=; } }): ret=null
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=; } }): ret=null
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:08.077 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026)
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026) : true
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=; } })
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:08.078 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-1:RangerDefaultPolicyResourceMatcher.getMatchType():406806:532653
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[get]
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6): null
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6)
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyItemEvaluator.isMatch(resource=null):147368:190839
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6): true
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@54502fd6): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@71a62686
14:10:08.079 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{get=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEvaluator.evaluate(requestHashCode=6f874dcf,policyId=3, policyName=all):1185131:1523796
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{get=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEngine.evaluatePolicies(requestHashCode=6f874dcf_0):2250874:3015280
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.080 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAction(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): null
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(null)
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(null)
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.kmsauth.request -- [PERF]:http-nio-34499-exec-1:RangerKmsAuthorizer.hasAccess(type=GET):4225474:5823402
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS)): true
14:10:08.081 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.082 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.082 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:10:08.086 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== createKey()
May 04 14:10:08 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903808, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:10:08.179237 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:37897
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44557
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:37897
--ranger_config_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-client
--trusted_user_acl=test-admin
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:34499/kms
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:10:08.342536  6359 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:10:08.342898  6359 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:10:08.342976  6359 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:10:08.347764  6359 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:10:08.347867  6359 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:10:08.347899  6359 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:10:08.347923  6359 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:10:08.347946  6359 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:10:08.356266  6359 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44557
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal
--ranger_kms_url=127.25.254.212:34499/kms
--trusted_user_acl=<redacted>
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:37897
--ranger_config_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-client
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:37897
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.6359
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:10:08.357918  6359 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:10:08.359203  6359 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:10:08.367645  6364 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:08.367836  6359 server_base.cc:1061] running on GCE node
W20260504 14:10:08.368171  6367 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:08.368285  6365 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:08.369148  6359 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:10:08.370501  6359 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:10:08.371708  6359 hybrid_clock.cc:648] HybridClock initialized: now 1777903808371674 us; error 56 us; skew 500 ppm
May 04 14:10:08 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903808, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:10:08.376008  6359 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:10:08.377521  6359 webserver.cc:492] Webserver started at http://127.25.254.254:37459/ using document root <none> and password file <none>
I20260504 14:10:08.378293  6359 fs_manager.cc:362] Metadata directory not provided
I20260504 14:10:08.378371  6359 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:10:08.378602  6359 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:10:08 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903808, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:10:08.382 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:10:08.391 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:08.391 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@9efc354)
14:10:08.395 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:10:08.397 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:10:08.397 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.397 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.397 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.Groups -- GroupCacheLoader - load.
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.access$400(Groups.java:75)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:334)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:270)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:228)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
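The `No groups found for user kudu` exception above is logged at DEBUG and is non-fatal: `UserGroupInformation.getGroupNames` swallows the lookup failure and the Ranger request proceeds with `userGroups={}`, as the subsequent `evaluatePolicies` lines show. A minimal sketch of that control flow (illustrative only, not Hadoop's actual `Groups`/`UserGroupInformation` code; the backing `mapping` dict is a stand-in for the configured group-mapping provider):

```python
# Sketch: a group lookup that, like org.apache.hadoop.security.Groups,
# treats "no groups" as an error at load time, while the caller degrades
# to an empty group list instead of failing the request.

class NoGroupsForUser(Exception):
    pass

class Groups:
    def __init__(self, mapping):
        self._mapping = mapping   # user -> groups (hypothetical backing store)
        self._cache = {}          # stands in for the Guava LoadingCache

    def _load(self, user):
        groups = self._mapping.get(user, [])
        if not groups:
            # Groups.noGroupsForUser raises IOException here, which is
            # what produces the DEBUG stack trace in the log above.
            raise NoGroupsForUser(f"No groups found for user {user}")
        return groups

    def get_groups(self, user):
        if user not in self._cache:
            self._cache[user] = self._load(user)
        return self._cache[user]

def group_names_or_empty(groups, user):
    # Mirrors the observed behavior: the failure is logged, and the
    # authorization request continues with userGroups={}.
    try:
        return groups.get_groups(user)
    except NoGroupsForUser:
        return []

print(group_names_or_empty(Groups({"keyadmin": ["kms"]}), "kudu"))  # -> []
```

This is why the stack trace alone does not explain the test failure: the group lookup failing for `kudu` only means the authorization decision rests entirely on user-level policy matches.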
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:08.398 [http-nio-34499-exec-3] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(222af370_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.398 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-3:RangerResourceTrie.traverse(resource=kuduclusterkey):15364:16068
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@55287102
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-3:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):128026:127824
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026) : true
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:08.399 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-3:RangerDefaultPolicyResourceMatcher.getMatchType():280297:281022
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): null
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): false
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-3:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):124255:124096
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): false
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-3:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):115949:116142
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): true
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@26ed4331): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-3:RangerPolicyEvaluator.evaluate(requestHashCode=222af370,policyId=3, policyName=all):1093278:1093838
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.400 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-3:RangerPolicyEngine.evaluatePolicies(requestHashCode=222af370_0):2043145:2215846
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-3:RangerResourceTrie.traverse(resource=kuduclusterkey):12959:13517
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@19b30bb2
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-3:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):218134:219595
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.401 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0;seq_num=0;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0;seq_num=0;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0;seq_num=1;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-0} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.402 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.404 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f6bcc64e598@3884f191]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:08.408 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getMetadata(kuduclusterkey)
14:10:08.408 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=true
14:10:08.408 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=true
14:10:08.408 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey)
14:10:08.408 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:10:08.413 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStoreProvider$KeyMetadata@46a16608
14:10:08.413 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey): ret=org.apache.hadoop.crypto.key.RangerKeyStoreProvider$KeyMetadata@46a16608
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getMetadata(kuduclusterkey): ret=cipher: AES/CTR/NoPadding, length: 128, description: kuduclusterkey, created: Mon May 04 14:10:08 UTC 2026, version: 1, attributes: [key.acl.name=kuduclusterkey] 
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.414 [http-nio-34499-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getMetadata(kuduclusterkey)
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getMetadata(kuduclusterkey): ret=cipher: AES/CTR/NoPadding, length: 128, description: kuduclusterkey, created: Mon May 04 14:10:08 UTC 2026, version: 1, attributes: [key.acl.name=kuduclusterkey] 
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getKeyVersion(kuduclusterkey@0)
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey@0): ret=true
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey@0)
14:10:08.415 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:10:08.416 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=javax.crypto.spec.SecretKeySpec@d4aa9fa9
14:10:08.416 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey@0): ret=javax.crypto.spec.SecretKeySpec@d4aa9fa9
14:10:08.416 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getKeyVersion(kuduclusterkey@0)
14:10:08.425 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.NativeCodeLoader -- Trying to load the custom-built native-hadoop library...
14:10:08.426 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.NativeCodeLoader -- Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path: /tmp/dist-test-taskMMfo7I/build/dist-test-system-libs/:/tmp/dist-test-taskMMfo7I/build/debug/lib:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
14:10:08.426 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.NativeCodeLoader -- java.library.path=/tmp/dist-test-taskMMfo7I/build/dist-test-system-libs/:/tmp/dist-test-taskMMfo7I/build/debug/lib:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
14:10:08.426 [http-nio-34499-exec-3] WARN org.apache.hadoop.util.NativeCodeLoader -- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14:10:08.426 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.OpensslCipher -- Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: 'boolean org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()'
	at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method)
	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:85)
	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:69)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:102)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension$DefaultCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:299)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:518)
	at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension$EncryptedQueueRefiller.fillQueueForKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:76)
	at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:249)
	at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:243)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:353)
	at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:293)
	at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension.generateEncryptedKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:125)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:518)
	at org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider.generateEncryptedKey(KeyAuthorizationKeyProvider.java:175)
	at org.apache.hadoop.crypto.key.kms.server.KMS.lambda$generateEncryptedKeys$10(KMS.java:532)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:08.427 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.428 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.430 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.430 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.430 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.431 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.431 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.431 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.432 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.432 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.432 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.432 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.433 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.433 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.433 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.433 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.434 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.434 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.434 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.434 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.435 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.435 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.435 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.435 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.436 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.436 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.436 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.436 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.437 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.437 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.438 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.438 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.439 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.439 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.439 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.439 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.440 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.440 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.440 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.440 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.441 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.441 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.449 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.449 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.450 [http-nio-34499-exec-3] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.254@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:08.450 [http-nio-34499-exec-3] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:10:08.455971  6359 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "0a97dbe8ab0646d5b4416094cac5d927"
format_stamp: "Formatted at 2026-05-04 14:10:08 on dist-test-slave-2x32"
server_key: "23f3039e4fd95a225e670fb9034ef41d"
server_key_iv: "09211ce5f1c14615a501872cc4d8e71f"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:08.456625  6359 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "0a97dbe8ab0646d5b4416094cac5d927"
format_stamp: "Formatted at 2026-05-04 14:10:08 on dist-test-slave-2x32"
server_key: "23f3039e4fd95a225e670fb9034ef41d"
server_key_iv: "09211ce5f1c14615a501872cc4d8e71f"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:08.461910  6359 fs_manager.cc:696] Time spent creating directory manager: real 0.005s	user 0.006s	sys 0.000s
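The `GENERATE_EEK` call and the later `eek_op=decrypt` request form a standard envelope-encryption round-trip: the KMS holds the cluster key (`kuduclusterkey@0`) and hands out only an encrypted data key, which the server persists (`server_key`, `server_key_iv`, `server_key_version` above) and asks the KMS to decrypt when it needs the plaintext. A minimal sketch of that flow, using a toy XOR keystream as a stand-in for AES-CTR; only the key-version name comes from the log, everything else is illustrative:

```python
# Toy envelope-encryption round-trip mirroring GENERATE_EEK / eek_op=decrypt.
# The XOR "cipher" stands in for AES-CTR; do not use this for real crypto.
import os
import hashlib


def _keystream(key: bytes, iv: bytes, n: int) -> bytes:
    """Deterministic keystream derived from (key, iv) via SHA-256 counter mode."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]


def _xor(data: bytes, key: bytes, iv: bytes) -> bytes:
    """XOR with the keystream; applying it twice is the identity."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, iv, len(data))))


class ToyKMS:
    """Holds the key-encryption key; callers only ever see encrypted DEKs."""

    def __init__(self):
        self.keys = {"kuduclusterkey@0": os.urandom(32)}

    def generate_eek(self, version: str):
        # GENERATE_EEK: mint a fresh data key, return it encrypted under the KEK.
        dek, iv = os.urandom(16), os.urandom(16)
        return _xor(dek, self.keys[version], iv), iv, version

    def decrypt_eek(self, edek: bytes, iv: bytes, version: str) -> bytes:
        # eek_op=decrypt: recover the plaintext data key server-side.
        return _xor(edek, self.keys[version], iv)


kms = ToyKMS()
edek, iv, ver = kms.generate_eek("kuduclusterkey@0")  # what the server persists
dek1 = kms.decrypt_eek(edek, iv, ver)                 # decrypted on startup
dek2 = kms.decrypt_eek(edek, iv, ver)                 # deterministic round-trip
```

The point of the pattern is that the plaintext cluster key never leaves the KMS; a server that loses its disk leaks only the encrypted DEK.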
14:10:08.464 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:10:08.465 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:08.465 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@20c9fefd)
14:10:08.469 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.472 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:08.473 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
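The IOException above is logged at DEBUG and is not itself the failure: Hadoop's `Groups.getGroups` throws when no configured group-mapping provider returns any group for `kudu`, and the Ranger authorizer then evaluates the request with an empty group list (the `userGroups={}` visible in the policy-engine lines that follow). A minimal sketch of that fallback shape, with hypothetical provider and helper names rather than Hadoop's actual classes:

```python
# Sketch of the group-resolution fallback behind "No groups found for user kudu".
# Provider/class names here are hypothetical, not Hadoop's real classes.
class NoGroupsForUser(IOError):
    """Raised when every provider returns an empty group list."""


class StaticMapping:
    """A hypothetical provider backed by a fixed user -> groups table."""

    def __init__(self, table):
        self.table = table

    def groups_for(self, user):
        return self.table.get(user, [])


class Groups:
    """Asks each provider in turn; raises if none knows the user."""

    def __init__(self, providers):
        self.providers = providers

    def get_groups(self, user):
        for provider in self.providers:
            groups = provider.groups_for(user)
            if groups:
                return groups
        raise NoGroupsForUser(f"No groups found for user {user}")


def resolve_groups(groups_service, user):
    """What the authorizer effectively does: log the miss, use an empty list."""
    try:
        return groups_service.get_groups(user)
    except NoGroupsForUser:
        return []  # request is still evaluated, just with no group info


g = Groups([StaticMapping({"hdfs": ["supergroup"]})])
print(resolve_groups(g, "hdfs"))  # ['supergroup']
print(resolve_groups(g, "kudu"))  # [] -> policies must match the user directly
```

With empty groups, authorization succeeds only if a Ranger policy grants `DECRYPT_EEK` to the user `kudu` directly, which is exactly what the subsequent `RangerPolicyEngineImpl.evaluatePolicies` lines are testing.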
14:10:08.473 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:08.473 [http-nio-34499-exec-4] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(623ba2_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:08.474 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-4:RangerResourceTrie.traverse(resource=kuduclusterkey):16315:17117
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@5e96a59e
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-4:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):194551:366457
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026)
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:08 UTC 2026) : true
14:10:08.475 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:08.476 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-4:RangerDefaultPolicyResourceMatcher.getMatchType():414423:800850
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): null
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.472 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): false
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-4:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):119633:196199
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): false
14:10:08.477 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): true
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): true
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-4:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):161579:296243
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): true
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@782dce3c): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-4:RangerPolicyEvaluator.evaluate(requestHashCode=623ba2,policyId=3, policyName=all):1537416:2645532
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.478 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-4:RangerPolicyEngine.evaluatePolicies(requestHashCode=623ba2_0):2852584:5114654
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-4:RangerResourceTrie.traverse(resource=kuduclusterkey):15090:15879
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@f1cf1e0
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-4:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):199181:303179
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.479 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:08 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1;seq_num=2;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1;seq_num=2;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:08.480 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:08 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1;seq_num=3;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:08.481 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-1} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:08.481 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:08.481 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.481 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.481 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.481 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.481 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.482 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.482 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.482 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f6bcc654208@b528a4d]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:08.483 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getKeyVersion(kuduclusterkey@0)
14:10:08.483 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey@0): ret=true
14:10:08.483 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey@0)
14:10:08.483 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:10:08.484 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=javax.crypto.spec.SecretKeySpec@d4aa9fa9
14:10:08.484 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey@0): ret=javax.crypto.spec.SecretKeySpec@d4aa9fa9
14:10:08.484 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getKeyVersion(kuduclusterkey@0)
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:08.485 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.485 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.486 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.486 [http-nio-34499-exec-4] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.254@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:08.486 [http-nio-34499-exec-4] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:10:08.486 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:08.490 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.491 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
I20260504 14:10:08.494022  6376 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:08.495765  6359 fs_manager.cc:730] Time spent opening block manager: real 0.006s	user 0.003s	sys 0.000s
14:10:08.495 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
I20260504 14:10:08.495922  6359 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "0a97dbe8ab0646d5b4416094cac5d927"
format_stamp: "Formatted at 2026-05-04 14:10:08 on dist-test-slave-2x32"
server_key: "23f3039e4fd95a225e670fb9034ef41d"
server_key_iv: "09211ce5f1c14615a501872cc4d8e71f"
server_key_version: "kuduclusterkey@0"
14:10:08.496 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
I20260504 14:10:08.496047  6359 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
14:10:08.496 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:08.496 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
I20260504 14:10:08.537976  6359 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:10:08.546633  6359 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:10:08.546870  6359 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:10:08.556012  6359 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:37897
I20260504 14:10:08.556022  6428 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:37897 every 8 connection(s)
I20260504 14:10:08.557081  6359 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:10:08.560068  6429 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:10:08.563741 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 6359
I20260504 14:10:08.563898 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:10:08.564181 26619 external_mini_cluster.cc:1468] Setting key 09d929b465f37008744d25932964de37
I20260504 14:10:08.566434  6429 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: Bootstrap starting.
I20260504 14:10:08.569068  6429 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: Neither blocks nor log segments found. Creating new log.
I20260504 14:10:08.569923  6429 log.cc:826] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: Log is configured to *not* fsync() on all Append() calls
I20260504 14:10:08.572038  6429 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: No bootstrap required, opened a new log
May 04 14:10:08 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903808, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:10:08.575839  6429 raft_consensus.cc:359] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } }
I20260504 14:10:08.576057  6429 raft_consensus.cc:385] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:10:08.576118  6429 raft_consensus.cc:740] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0a97dbe8ab0646d5b4416094cac5d927, State: Initialized, Role: FOLLOWER
I20260504 14:10:08.576637  6429 consensus_queue.cc:260] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } }
I20260504 14:10:08.576800  6429 raft_consensus.cc:399] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:10:08.576861  6429 raft_consensus.cc:493] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:10:08.576964  6429 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:10:08.578204  6429 raft_consensus.cc:515] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } }
I20260504 14:10:08.578616  6429 leader_election.cc:304] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 0a97dbe8ab0646d5b4416094cac5d927; no voters: 
I20260504 14:10:08.578996  6429 leader_election.cc:290] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:10:08.579099  6432 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:08.565749 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:40314 (local address 127.25.254.254:37897)
0504 14:10:08.566262 (+   513us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:08.566275 (+    13us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:08.566314 (+    39us) server_negotiation.cc:408] Connection header received
0504 14:10:08.567001 (+   687us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:08.567027 (+    26us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:08.567429 (+   402us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:08.567811 (+   382us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:10:08.568907 (+  1096us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:08.570049 (+  1142us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:08.570967 (+   918us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:08.571237 (+   270us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:08.573992 (+  2755us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:10:08.574016 (+    24us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:10:08.574032 (+    16us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:10:08.574066 (+    34us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:10:08.576173 (+  2107us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:08.576665 (+   492us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:08.576671 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:08.576678 (+     7us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:08.576763 (+    85us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:08.577047 (+   284us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:08.577051 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:08.577053 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:08.577497 (+   444us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:10:08.577689 (+   192us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:08.578016 (+   327us) server_negotiation.cc:300] Negotiation successful
0504 14:10:08.578273 (+   257us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":304,"thread_start_us":130,"threads_started":1}
I20260504 14:10:08.580084  6429 sys_catalog.cc:565] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:10:08.581501  6434 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:10:08.581715  6434 raft_consensus.cc:697] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [term 1 LEADER]: Becoming Leader. State: Replica: 0a97dbe8ab0646d5b4416094cac5d927, State: Running, Role: LEADER
I20260504 14:10:08.582005  6434 consensus_queue.cc:237] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } }
I20260504 14:10:08.583876  6441 sys_catalog.cc:455] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "0a97dbe8ab0646d5b4416094cac5d927" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } } }
I20260504 14:10:08.584002  6441 sys_catalog.cc:458] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [sys.catalog]: This master's current role is: LEADER
I20260504 14:10:08.584107  6429 ranger_client.cc:318] Using new properties file: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/master-0/logs/kudu-ranger-subprocess-log4j2.properties
I20260504 14:10:08.584354  6448 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:10:08.585351  6435 sys_catalog.cc:455] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 0a97dbe8ab0646d5b4416094cac5d927. Latest consensus state: current_term: 1 leader_uuid: "0a97dbe8ab0646d5b4416094cac5d927" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "0a97dbe8ab0646d5b4416094cac5d927" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 37897 } } }
I20260504 14:10:08.585546  6435 sys_catalog.cc:458] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927 [sys.catalog]: This master's current role is: LEADER
I20260504 14:10:08.592103  6448 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:10:08.599225  6448 catalog_manager.cc:1357] Generated new cluster ID: f8bc360509564b8baf9987cf1b55706b
I20260504 14:10:08.599313  6448 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:10:08.623517  6448 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:10:08.624728  6448 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:10:08.642912  6448 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: Generated new TSK 0
I20260504 14:10:08.643797  6448 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:10:09.443037 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:34499/kms
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:37897
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44557
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:10:09.580072  6488 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:10:09.580327  6488 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:10:09.580437  6488 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:10:09.585089  6488 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:10:09.585243  6488 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:10:09.585412  6488 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:10:09.590787  6488 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44557
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal
--ranger_kms_url=127.25.254.212:34499/kms
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:37897
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.6488
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:10:09.592507  6488 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:10:09.593695  6488 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:10:09.601893  6496 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:09.602408  6495 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:09.602473  6498 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:09.603703  6488 server_base.cc:1061] running on GCE node
I20260504 14:10:09.604103  6488 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:10:09.604782  6488 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:10:09.605960  6488 hybrid_clock.cc:648] HybridClock initialized: now 1777903809605940 us; error 39 us; skew 500 ppm
May 04 14:10:09 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903809, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:10:09.609771  6488 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:10:09.611270  6488 webserver.cc:492] Webserver started at http://127.25.254.193:38277/ using document root <none> and password file <none>
I20260504 14:10:09.612073  6488 fs_manager.cc:362] Metadata directory not provided
I20260504 14:10:09.612430  6488 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:10:09.612763  6488 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:10:09 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903809, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:10:09.617 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:10:09.617 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:09.617 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@e3a236f)
14:10:09.625 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.632 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:09.633 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:09.633 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:09.634 [http-nio-34499-exec-6] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(70cb20ec_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:09.634 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-6:RangerResourceTrie.traverse(resource=kuduclusterkey):17601:17943
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@665f0c5a
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:09.635 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-6:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):354077:546510
14:10:09.636 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:09.636 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:09 UTC 2026)
14:10:09.636 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:09 UTC 2026) : true
14:10:09.636 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.637 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.637 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:09.637 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:09.637 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:09.642 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:09.642 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:09.643 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:09.643 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.643 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.643 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-6:RangerDefaultPolicyResourceMatcher.getMatchType():857493:6998786
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:09.644 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): null
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.645 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): false
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-6:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):226996:1116635
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): false
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:09.646 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): true
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac)
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): true
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-6:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):212508:408924
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): true
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@21ef39ac): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-6:RangerPolicyEvaluator.evaluate(requestHashCode=70cb20ec,policyId=3, policyName=all):2469202:11476872
14:10:09.647 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-6:RangerPolicyEngine.evaluatePolicies(requestHashCode=70cb20ec_0):4188942:14387162
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-6:RangerResourceTrie.traverse(resource=kuduclusterkey):38436:39452
14:10:09.648 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@4a1df8b2
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-6:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):263711:415325
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.649 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2;seq_num=4;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:09.650 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2;seq_num=4;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2;seq_num=5;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-2} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.651 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f6bcc64e598@7d5b80b4]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:09.652 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:09.653 [http-nio-34499-exec-6] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.193@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:09.653 [http-nio-34499-exec-6] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:10:09.658331  6488 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "5967e8b7dd55430493459214df9f79fa"
format_stamp: "Formatted at 2026-05-04 14:10:09 on dist-test-slave-2x32"
server_key: "eb3d6901b5eae81ffa22eb84bdf66d59"
server_key_iv: "09e067af67600eb8c10fc0675f72f8f2"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:09.658998  6488 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "5967e8b7dd55430493459214df9f79fa"
format_stamp: "Formatted at 2026-05-04 14:10:09 on dist-test-slave-2x32"
server_key: "eb3d6901b5eae81ffa22eb84bdf66d59"
server_key_iv: "09e067af67600eb8c10fc0675f72f8f2"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:09.665795  6488 fs_manager.cc:696] Time spent creating directory manager: real 0.006s	user 0.004s	sys 0.000s
14:10:09.669 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:10:09.669 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:09.669 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@5f910aed)
14:10:09.674 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:10:09.680 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:09.681 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:09.682 [http-nio-34499-exec-7] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(47fa43d6_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-7:RangerResourceTrie.traverse(resource=kuduclusterkey):14958:15631
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@2131f1d1
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-7:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):134288:133686
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:09 UTC 2026)
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:09 UTC 2026) : true
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.682 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-7:RangerDefaultPolicyResourceMatcher.getMatchType():327254:327359
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): null
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): false
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-7:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):71182:71185
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): false
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-7:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):75522:75944
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-7:RangerPolicyEvaluator.evaluate(requestHashCode=47fa43d6,policyId=3, policyName=all):974449:973830
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:09.683 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-7:RangerPolicyEngine.evaluatePolicies(requestHashCode=47fa43d6_0):1955127:1961786
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-7:RangerResourceTrie.traverse(resource=kuduclusterkey):13257:14014
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@70441a1b
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-7:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):134784:135317
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:09 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.684 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3;seq_num=6;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3;seq_num=6;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:09 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3;seq_num=7;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-3} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f6bcc654208@4ba4d938]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:09.685 [http-nio-34499-exec-7] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:09.686 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:09.686 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:09.686 [http-nio-34499-exec-7] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.193@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:09.686 [http-nio-34499-exec-7] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:10:09.691749  6507 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:09.693174  6488 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.003s	sys 0.000s
I20260504 14:10:09.693328  6488 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "5967e8b7dd55430493459214df9f79fa"
format_stamp: "Formatted at 2026-05-04 14:10:09 on dist-test-slave-2x32"
server_key: "eb3d6901b5eae81ffa22eb84bdf66d59"
server_key_iv: "09e067af67600eb8c10fc0675f72f8f2"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:09.693478  6488 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:10:09.708417  6488 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:10:09.711905  6488 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:10:09.712105  6488 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:10:09.712805  6488 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:10:09.714039  6488 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:10:09.714110  6488 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:09.714221  6488 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:10:09.714255  6488 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:09.729256  6488 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:38475
I20260504 14:10:09.731292  6620 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:38475 every 8 connection(s)
I20260504 14:10:09.732340  6488 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:10:09.732815 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 6488
I20260504 14:10:09.732972 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:10:09.733275 26619 external_mini_cluster.cc:1468] Setting key c117432b9fc0c235d008c1ae97dc4773
May 04 14:10:09 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903809, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
I20260504 14:10:09.763309  6625 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:09.739632 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:37897 (local address 127.25.254.193:51747)
0504 14:10:09.740922 (+  1290us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:09.740972 (+    50us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:09.742005 (+  1033us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:09.745640 (+  3635us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:09.745654 (+    14us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:09.746049 (+   395us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:10:09.746814 (+   765us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:09.746835 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:09.748021 (+  1186us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:09.748025 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:09.748655 (+   630us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:09.748665 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:09.748903 (+   238us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:09.750829 (+  1926us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:10:09.750859 (+    30us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:10:09.757158 (+  6299us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:10:09.759814 (+  2656us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:09.759822 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:09.759838 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:09.760234 (+   396us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:09.760697 (+   463us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:09.760701 (+     4us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:09.760703 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:09.760863 (+   160us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:09.761349 (+   486us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:10:09.761356 (+     7us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:10:09.761758 (+   402us) client_negotiation.cc:770] Sending connection context
0504 14:10:09.761993 (+   235us) client_negotiation.cc:241] Negotiation successful
0504 14:10:09.762313 (+   320us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":1060,"thread_start_us":145,"threads_started":1}
I20260504 14:10:09.764262  6624 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:09.738613 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:51747 (local address 127.25.254.254:37897)
0504 14:10:09.744994 (+  6381us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:09.744999 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:09.745018 (+    19us) server_negotiation.cc:408] Connection header received
0504 14:10:09.745071 (+    53us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:09.745075 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:09.745141 (+    66us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:09.745271 (+   130us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:10:09.747020 (+  1749us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:09.747851 (+   831us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:09.750235 (+  2384us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:09.750442 (+   207us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:09.757449 (+  7007us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:10:09.757482 (+    33us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:10:09.757486 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:10:09.757526 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:10:09.759628 (+  2102us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:09.760373 (+   745us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:09.760377 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:09.760380 (+     3us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:09.760438 (+    58us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:09.760978 (+   540us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:09.760982 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:09.760984 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:09.761174 (+   190us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:10:09.761286 (+   112us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:09.763843 (+  2557us) server_negotiation.cc:300] Negotiation successful
0504 14:10:09.763991 (+   148us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":6267,"thread_start_us":101,"threads_started":1}
I20260504 14:10:09.765623  6621 heartbeater.cc:344] Connected to a master server at 127.25.254.254:37897
I20260504 14:10:09.765897  6621 heartbeater.cc:461] Registering TS with master...
I20260504 14:10:09.766575  6621 heartbeater.cc:507] Master 127.25.254.254:37897 requested a full tablet report, sending...
I20260504 14:10:09.768525  6393 ts_manager.cc:194] Registered new tserver with Master: 5967e8b7dd55430493459214df9f79fa (127.25.254.193:38475)
I20260504 14:10:09.770134  6393 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:51747
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
May 04 14:10:09 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903809, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:10:09.850987 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:34499/kms
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:37897
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44557
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:10:09.990679  6629 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:10:09.991164  6629 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:10:09.991334  6629 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:10:09.996416  6629 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:10:09.996620  6629 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:10:09.996804  6629 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:10:10.003610  6629 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44557
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal
--ranger_kms_url=127.25.254.212:34499/kms
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:37897
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.6629
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:10:10.005401  6629 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:10:10.006640  6629 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:10:10.015563  6638 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:10.016548  6636 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:10.016743  6635 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:10.017696  6629 server_base.cc:1061] running on GCE node
I20260504 14:10:10.018326  6629 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:10:10.019086  6629 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:10:10.020292  6629 hybrid_clock.cc:648] HybridClock initialized: now 1777903810020231 us; error 90 us; skew 500 ppm
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:10:10.024551  6629 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:10:10.026087  6629 webserver.cc:492] Webserver started at http://127.25.254.194:46765/ using document root <none> and password file <none>
I20260504 14:10:10.026933  6629 fs_manager.cc:362] Metadata directory not provided
I20260504 14:10:10.027084  6629 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:10:10.027462  6629 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:10:10.032 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:10:10.032 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:10.032 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@55d5606e)
14:10:10.037 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:10.039 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:10.040 [http-nio-34499-exec-9] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(1a2f8e9f_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.040 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-9:RangerResourceTrie.traverse(resource=kuduclusterkey):15172:15919
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@64874ba4
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-9:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):158260:157904
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026)
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026) : true
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-9:RangerDefaultPolicyResourceMatcher.getMatchType():327743:328025
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:10.041 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): null
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): false
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-9:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):123794:132439
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): false
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.046 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-9:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):204876:205353
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-9:RangerPolicyEvaluator.evaluate(requestHashCode=1a2f8e9f,policyId=3, policyName=all):1458894:6015305
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-9:RangerPolicyEngine.evaluatePolicies(requestHashCode=1a2f8e9f_0):2405983:6961563
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-9:RangerResourceTrie.traverse(resource=kuduclusterkey):15280:15685
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@1c3d4fbe
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-9:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):183073:183485
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:10.047 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4;seq_num=8;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4;seq_num=8;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4;seq_num=9;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-4} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.048 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f6bcc64e598@2484df5f]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.049 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.054 [http-nio-34499-exec-9] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.194@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:10.054 [http-nio-34499-exec-9] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:10:10.062811  6629 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "93019787649c421e94cbf1022725e556"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "c670cbb0c2487eb1e24baad3ade698bb"
server_key_iv: "f5c37658c1b541112fa58c992471e334"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.064028  6629 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "93019787649c421e94cbf1022725e556"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "c670cbb0c2487eb1e24baad3ade698bb"
server_key_iv: "f5c37658c1b541112fa58c992471e334"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.071820  6629 fs_manager.cc:696] Time spent creating directory manager: real 0.007s	user 0.004s	sys 0.000s
14:10:10.075 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:10:10.075 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:10.075 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@2f6f3a95)
14:10:10.084 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:10:10.088 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:10:10.098 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.098 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.098 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.098 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.098 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:10.099 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.099 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:10.100 [http-nio-34499-exec-10] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(3678fe26_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.100 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-10:RangerResourceTrie.traverse(resource=kuduclusterkey):17938:18736
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@513fc960
14:10:10.101 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:10.102 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-10:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):230875:442149
14:10:10.102 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:10.102 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026)
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026) : true
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:10.104 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-10:RangerDefaultPolicyResourceMatcher.getMatchType():480346:985426
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.105 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): null
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.106 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): false
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-10:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):185307:344875
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): false
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): true
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db)
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): true
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-10:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):181149:360689
14:10:10.107 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): true
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@4e38a2db): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-10:RangerPolicyEvaluator.evaluate(requestHashCode=3678fe26,policyId=3, policyName=all):1840086:3942582
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.108 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-10:RangerPolicyEngine.evaluatePolicies(requestHashCode=3678fe26_0):3407969:8858207
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-10:RangerResourceTrie.traverse(resource=kuduclusterkey):33519:33714
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@ba347c
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-10:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):242192:397957
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:10.109 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.110 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.110 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.110 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.110 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.110 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.111 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5;seq_num=10;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5;seq_num=10;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5;seq_num=11;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-5} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.112 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f6bcc654208@5e99c002]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.113 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.114 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:10.119 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:10.119 [http-nio-34499-exec-10] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.194@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:10.119 [http-nio-34499-exec-10] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:10:10.138701  6646 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:10.147809  6629 fs_manager.cc:730] Time spent opening block manager: real 0.024s	user 0.002s	sys 0.000s
I20260504 14:10:10.147990  6629 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "93019787649c421e94cbf1022725e556"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "c670cbb0c2487eb1e24baad3ade698bb"
server_key_iv: "f5c37658c1b541112fa58c992471e334"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.148128  6629 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:10:10.245582  6629 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:10:10.251421  6629 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:10:10.251778  6629 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:10:10.252633  6629 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:10:10.254021  6629 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:10:10.254199  6629 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.001s	sys 0.000s
I20260504 14:10:10.254366  6629 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:10:10.254453  6629 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903809, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
I20260504 14:10:10.282331  6629 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:35361
I20260504 14:10:10.284066  6629 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:10:10.288442 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 6629
I20260504 14:10:10.288607 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:10:10.288905 26619 external_mini_cluster.cc:1468] Setting key ec5ae19ae862549bc86180f987ccb291
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:10:10.291100  6759 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:35361 every 8 connection(s)
I20260504 14:10:10.314301  6764 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.294254 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.194:46051 (local address 127.25.254.254:37897)
0504 14:10:10.298361 (+  4107us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.298375 (+    14us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.302239 (+  3864us) server_negotiation.cc:408] Connection header received
0504 14:10:10.302298 (+    59us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.302301 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.302363 (+    62us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.302495 (+   132us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:10:10.304188 (+  1693us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.304928 (+   740us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.306140 (+  1212us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.306364 (+   224us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.309364 (+  3000us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:10:10.309403 (+    39us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:10:10.309405 (+     2us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:10:10.309435 (+    30us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:10:10.311456 (+  2021us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.312180 (+   724us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.312186 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.312188 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.312248 (+    60us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.312632 (+   384us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.312636 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.312638 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.312890 (+   252us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:10:10.313124 (+   234us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:10.313936 (+   812us) server_negotiation.cc:300] Negotiation successful
0504 14:10:10.314076 (+   140us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":4000,"thread_start_us":84,"threads_started":1}
I20260504 14:10:10.315658  6763 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.293213 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:37897 (local address 127.25.254.194:46051)
0504 14:10:10.298442 (+  5229us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:10.298512 (+    70us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:10.300100 (+  1588us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:10.302782 (+  2682us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:10.302796 (+    14us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:10.303217 (+   421us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:10:10.303978 (+   761us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.304002 (+    24us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.305082 (+  1080us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.305090 (+     8us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:10.305982 (+   892us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.305996 (+    14us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.306336 (+   340us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.307150 (+   814us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:10:10.307188 (+    38us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:10:10.309181 (+  1993us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:10:10.311635 (+  2454us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:10.311643 (+     8us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:10.311660 (+    17us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:10.312026 (+   366us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:10.312369 (+   343us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:10.312372 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:10.312374 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:10.312517 (+   143us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:10.313277 (+   760us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:10:10.313283 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:10:10.313616 (+   333us) client_negotiation.cc:770] Sending connection context
0504 14:10:10.314530 (+   914us) client_negotiation.cc:241] Negotiation successful
0504 14:10:10.314785 (+   255us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":4946,"thread_start_us":136,"threads_started":1}
I20260504 14:10:10.317066  6760 heartbeater.cc:344] Connected to a master server at 127.25.254.254:37897
I20260504 14:10:10.317361  6760 heartbeater.cc:461] Registering TS with master...
I20260504 14:10:10.318060  6760 heartbeater.cc:507] Master 127.25.254.254:37897 requested a full tablet report, sending...
I20260504 14:10:10.323689  6393 ts_manager.cc:194] Registered new tserver with Master: 93019787649c421e94cbf1022725e556 (127.25.254.194:35361)
I20260504 14:10:10.324396  6393 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.194@KRBTEST.COM'} at 127.25.254.194:46051
WARNING: no policy specified for kudu/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.195@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.195@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.195 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:10:10.442842 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:34499/kms
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.195:0
--local_ip_for_outbound_sockets=127.25.254.195
--webserver_interface=127.25.254.195
--webserver_port=0
--tserver_master_addrs=127.25.254.254:37897
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:44557
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:10:10.587623  6769 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:10:10.588059  6769 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:10:10.588203  6769 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:10:10.593815  6769 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:10:10.594033  6769 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:10:10.594257  6769 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.195
I20260504 14:10:10.602206  6769 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:44557
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal
--ranger_kms_url=127.25.254.212:34499/kms
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.195
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.195:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.25.254.195
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:37897
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.6769
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.195
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:10:10.604087  6769 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:10:10.605505  6769 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:10:10.615347  6777 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:10:10.616577  6774 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:10.617352  6769 server_base.cc:1061] running on GCE node
W20260504 14:10:10.617789  6775 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:10:10.618418  6769 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:10:10.619230  6769 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:10:10.620393  6769 hybrid_clock.cc:648] HybridClock initialized: now 1777903810620329 us; error 85 us; skew 500 ppm
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:10:10.624848  6769 init.cc:377] Logged in from keytab as kudu/127.25.254.195@KRBTEST.COM (short username kudu)
I20260504 14:10:10.626535  6769 webserver.cc:492] Webserver started at http://127.25.254.195:42377/ using document root <none> and password file <none>
I20260504 14:10:10.627449  6769 fs_manager.cc:362] Metadata directory not provided
I20260504 14:10:10.627604  6769 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:10:10.627961  6769 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:10:10.635 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:10:10.636 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:10.636 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@7bd7bf46)
14:10:10.643 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:10:10.648 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:10:10.652 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.652 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.652 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:10.653 [http-nio-34499-exec-1] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(59ec5668_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:10.653 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):17613:18693
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@1f07ff9c
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):148150:151873
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026) : true
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-1:RangerDefaultPolicyResourceMatcher.getMatchType():245039:245406
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): null
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.654 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:10.655 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:10.655 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): false
14:10:10.658 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):158065:3498050
14:10:10.658 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): false
14:10:10.658 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): true
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81)
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): true
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):273810:272456
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): true
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70c9dc81): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:10.660 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEvaluator.evaluate(requestHashCode=59ec5668,policyId=3, policyName=all):1589762:13826794
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-1:RangerPolicyEngine.evaluatePolicies(requestHashCode=59ec5668_0):2523500:14759475
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):15753:16279
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@5eadffa2
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:10.668 [http-nio-34499-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):216233:215975
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6;seq_num=12;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6;seq_num=12;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6;seq_num=13;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-6} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.669 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.670 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f6bcc64e598@45925533]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.671 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:10:10.672 [http-nio-34499-exec-1] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.195@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:10.672 [http-nio-34499-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:10:10.678905  6769 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data/instance:
uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "b7fb19f2161cf405c395a911ea43811a"
server_key_iv: "b5c127f3d3cbf2df884e6c38330dd001"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.679602  6769 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance:
uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "b7fb19f2161cf405c395a911ea43811a"
server_key_iv: "b5c127f3d3cbf2df884e6c38330dd001"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.695815  6769 fs_manager.cc:696] Time spent creating directory manager: real 0.016s	user 0.005s	sys 0.000s
14:10:10.698 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:10:10.699 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:10:10.699 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@412d9ec2)
14:10:10.703 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:34499/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:10:10.707 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:10:10.708 [http-nio-34499-exec-2] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(2321263a_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-2:RangerResourceTrie.traverse(resource=kuduclusterkey):13547:14240
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@3f134c19
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871]]
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-2:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):105381:105337
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026)
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:10:10 UTC 2026) : true
14:10:10.708 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-34499-exec-2:RangerDefaultPolicyResourceMatcher.getMatchType():240695:241340
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): null
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): false
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-2:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):69118:69372
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): false
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-34499-exec-2:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):77204:77547
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): true
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@5b0f562c): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@6cf7d42f
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-34499-exec-2:RangerPolicyEvaluator.evaluate(requestHashCode=2321263a,policyId=3, policyName=all):821341:821508
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-34499-exec-2:RangerPolicyEngine.evaluatePolicies(requestHashCode=2321263a_0):1511971:1510988
14:10:10.709 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-34499-exec-2:RangerResourceTrie.traverse(resource=kuduclusterkey):12272:12934
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@6ef89de3
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@33fedecb, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@54f0041f]]
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-34499-exec-2:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):238476:238957
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:10:10 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7;seq_num=14;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7;seq_num=14;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:10:10 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7;seq_num=15;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:10:10.710 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={b6caadd3-a810-48a3-a13b-1ded6e6de8f1-7} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f6bcc654208@619499e5]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.195@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:10:10.711 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:10:10.712 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:10:10.712 [http-nio-34499-exec-2] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.195@KRBTEST.COM, accessCount=1, interval=0ms] 
14:10:10.712 [http-nio-34499-exec-2] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:10:10.722646  6784 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:10.737663  6769 fs_manager.cc:730] Time spent opening block manager: real 0.021s	user 0.002s	sys 0.000s
I20260504 14:10:10.737839  6769 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal
uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a"
format_stamp: "Formatted at 2026-05-04 14:10:10 on dist-test-slave-2x32"
server_key: "b7fb19f2161cf405c395a911ea43811a"
server_key_iv: "b5c127f3d3cbf2df884e6c38330dd001"
server_key_version: "kuduclusterkey@0"
I20260504 14:10:10.737977  6769 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:10:10.763212  6769 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:10:10.772789  6769 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:10:10.773008  6769 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:10:10.773726  6769 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:10:10.774919  6769 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:10:10.774991  6769 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:10.775058  6769 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:10:10.775084  6769 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:10:10.779851  6621 heartbeater.cc:499] Master 127.25.254.254:37897 was elected leader, sending a full tablet report...
I20260504 14:10:10.789299  6769 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.195:40163
I20260504 14:10:10.790735  6769 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/data/info.pb
I20260504 14:10:10.796844 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 6769
I20260504 14:10:10.796984 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal/instance
I20260504 14:10:10.797300 26619 external_mini_cluster.cc:1468] Setting key 9dd133d83c36de2fe9bf833bc069ab30
May 04 14:10:10 dist-test-slave-2x32 krb5kdc[5621](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903810, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.195@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:10:10.819918  6764 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.806272 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:55081 (local address 127.25.254.254:37897)
0504 14:10:10.806428 (+   156us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.806433 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.806933 (+   500us) server_negotiation.cc:408] Connection header received
0504 14:10:10.808176 (+  1243us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.808183 (+     7us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.808254 (+    71us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.808320 (+    66us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:10:10.810052 (+  1732us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.810858 (+   806us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.811692 (+   834us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.811891 (+   199us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.814896 (+  3005us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:10:10.814915 (+    19us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:10:10.814918 (+     3us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:10:10.814947 (+    29us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:10:10.816712 (+  1765us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.818131 (+  1419us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.818135 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.818137 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.818223 (+    86us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.818585 (+   362us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.818589 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.818591 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.818775 (+   184us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:10:10.818846 (+    71us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:10.819571 (+   725us) server_negotiation.cc:300] Negotiation successful
0504 14:10:10.819711 (+   140us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":41}
I20260504 14:10:10.801481  6897 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.195:40163 every 8 connection(s)
I20260504 14:10:10.821744  6900 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.805050 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:37897 (local address 127.25.254.195:55081)
0504 14:10:10.806780 (+  1730us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:10.806831 (+    51us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:10.807911 (+  1080us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:10.808800 (+   889us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:10.808811 (+    11us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:10.809184 (+   373us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:10:10.809872 (+   688us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.809889 (+    17us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.811023 (+  1134us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.811026 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:10.811561 (+   535us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.811571 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.812125 (+   554us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.812809 (+   684us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:10:10.812834 (+    25us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:10:10.814711 (+  1877us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:10:10.817623 (+  2912us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:10.817630 (+     7us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:10.817644 (+    14us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:10.818000 (+   356us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:10.818335 (+   335us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:10:10.818338 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:10:10.818340 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:10:10.818478 (+   138us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:10:10.818926 (+   448us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:10:10.818932 (+     6us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:10:10.819245 (+   313us) client_negotiation.cc:770] Sending connection context
0504 14:10:10.820526 (+  1281us) client_negotiation.cc:241] Negotiation successful
0504 14:10:10.820780 (+   254us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":1527,"thread_start_us":158,"threads_started":1}
I20260504 14:10:10.825134  6898 heartbeater.cc:344] Connected to a master server at 127.25.254.254:37897
I20260504 14:10:10.825419  6898 heartbeater.cc:461] Registering TS with master...
I20260504 14:10:10.826018  6898 heartbeater.cc:507] Master 127.25.254.254:37897 requested a full tablet report, sending...
I20260504 14:10:10.827226  6393 ts_manager.cc:194] Registered new tserver with Master: c5d5ef2e3b7d49429fd43768e4e3ae9a (127.25.254.195:40163)
I20260504 14:10:10.827966  6393 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.195@KRBTEST.COM'} at 127.25.254.195:55081
I20260504 14:10:10.836735 26619 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260504 14:10:10.878716  6764 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.849272 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:40320 (local address 127.25.254.254:37897)
0504 14:10:10.849426 (+   154us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.849431 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.860445 (+ 11014us) server_negotiation.cc:408] Connection header received
0504 14:10:10.860691 (+   246us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.860695 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.860759 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.860815 (+    56us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:10:10.863972 (+  3157us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.864872 (+   900us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.866669 (+  1797us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.866875 (+   206us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.871654 (+  4779us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:10:10.871683 (+    29us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:10:10.871687 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:10:10.871727 (+    40us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:10:10.873599 (+  1872us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.874199 (+   600us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.874204 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.874206 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.874270 (+    64us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:10:10.874584 (+   314us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:10:10.874587 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:10:10.874589 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:10:10.874773 (+   184us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:10:10.874886 (+   113us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:10.878366 (+  3480us) server_negotiation.cc:300] Negotiation successful
0504 14:10:10.878549 (+   183us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":59}
I20260504 14:10:10.882885  6393 catalog_manager.cc:2257] Servicing CreateTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:40320:
name: "test-table"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "val"
    type: INT32
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260504 14:10:10.886541  6393 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-table in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260504 14:10:10.990562  6917 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.961334 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.195:40163 (local address 127.0.0.1:34682)
0504 14:10:10.982068 (+ 20734us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:10.982080 (+    12us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:10.982147 (+    67us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:10.982995 (+   848us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:10.983001 (+     6us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:10.983024 (+    23us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:10:10.983337 (+   313us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.983346 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.989002 (+  5656us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.989007 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:10.989956 (+   949us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.989966 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.990094 (+   128us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.990196 (+   102us) client_negotiation.cc:770] Sending connection context
0504 14:10:10.990301 (+   105us) client_negotiation.cc:241] Negotiation successful
0504 14:10:10.990384 (+    83us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":20673,"thread_start_us":95,"threads_started":1}
I20260504 14:10:10.998723  6916 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.961118 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:54626 (local address 127.25.254.193:38475)
0504 14:10:10.971370 (+ 10252us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.971377 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.981893 (+ 10516us) server_negotiation.cc:408] Connection header received
0504 14:10:10.981999 (+   106us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.982005 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.982259 (+   254us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.982411 (+   152us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:10:10.994696 (+ 12285us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.995865 (+  1169us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.997325 (+  1460us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.997959 (+   634us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.998112 (+   153us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:10.998441 (+   329us) server_negotiation.cc:300] Negotiation successful
0504 14:10:10.998549 (+   108us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":10109,"thread_start_us":74,"threads_started":1}
I20260504 14:10:10.999287  6918 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.961741 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34682 (local address 127.25.254.195:40163)
0504 14:10:10.982443 (+ 20702us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.982449 (+     6us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.982470 (+    21us) server_negotiation.cc:408] Connection header received
0504 14:10:10.982547 (+    77us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.982552 (+     5us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.982696 (+   144us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.982830 (+   134us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:10:10.986310 (+  3480us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.987500 (+  1190us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.998280 (+ 10780us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.998864 (+   584us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.998973 (+   109us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:10.999044 (+    71us) server_negotiation.cc:300] Negotiation successful
0504 14:10:10.999127 (+    83us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":20571,"thread_start_us":116,"threads_started":1}
I20260504 14:10:10.993438  6913 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.957865 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35361 (local address 127.0.0.1:48938)
0504 14:10:10.980504 (+ 22639us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:10.980555 (+    51us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:10.980709 (+   154us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:10.985830 (+  5121us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:10.985834 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:10.985853 (+    19us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:10:10.986125 (+   272us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.986133 (+     8us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.991012 (+  4879us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.991016 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:10.993012 (+  1996us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.993022 (+    10us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.993136 (+   114us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.993155 (+    19us) client_negotiation.cc:770] Sending connection context
0504 14:10:10.993207 (+    52us) client_negotiation.cc:241] Negotiation successful
0504 14:10:10.993269 (+    62us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":22406,"thread_start_us":131,"threads_started":1}
I20260504 14:10:11.005280  6832 tablet_service.cc:1511] Processing CreateTablet for tablet cabb9032c1684ac19c8fb9b879df513e (DEFAULT_TABLE table=test-table [id=dffbd474f1ae46afbaedc44c012d0489]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:10:10.999679  6915 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.960946 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:38475 (local address 127.0.0.1:54626)
0504 14:10:10.981762 (+ 20816us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:10.981777 (+    15us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:10.981883 (+   106us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:10.994232 (+ 12349us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:10.994236 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:10.994266 (+    30us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:10:10.994526 (+   260us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.994533 (+     7us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.996030 (+  1497us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.996033 (+     3us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:10.997126 (+  1093us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:10.997138 (+    12us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.998227 (+  1089us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:10.998245 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:10:10.999284 (+  1039us) client_negotiation.cc:241] Negotiation successful
0504 14:10:10.999534 (+   250us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":20738,"thread_start_us":104,"threads_started":1}
I20260504 14:10:11.006609  6832 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cabb9032c1684ac19c8fb9b879df513e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:10:11.007706  6914 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:10.958882 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:48938 (local address 127.25.254.194:35361)
0504 14:10:10.981004 (+ 22122us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:10.981012 (+     8us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:10.981037 (+    25us) server_negotiation.cc:408] Connection header received
0504 14:10:10.981126 (+    89us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:10.981132 (+     6us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:10.981344 (+   212us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:10.981503 (+   159us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:10:10.987675 (+  6172us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:10.988850 (+  1175us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:10.999531 (+ 10681us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.000073 (+   542us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.000173 (+   100us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:11.000247 (+    74us) server_negotiation.cc:300] Negotiation successful
0504 14:10:11.000330 (+    83us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":22002,"thread_start_us":96,"threads_started":1}
I20260504 14:10:11.009753  6554 tablet_service.cc:1511] Processing CreateTablet for tablet cabb9032c1684ac19c8fb9b879df513e (DEFAULT_TABLE table=test-table [id=dffbd474f1ae46afbaedc44c012d0489]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:10:11.010946  6554 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cabb9032c1684ac19c8fb9b879df513e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:10:11.010851  6694 tablet_service.cc:1511] Processing CreateTablet for tablet cabb9032c1684ac19c8fb9b879df513e (DEFAULT_TABLE table=test-table [id=dffbd474f1ae46afbaedc44c012d0489]), partition=RANGE (key) PARTITION UNBOUNDED
I20260504 14:10:11.011906  6694 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cabb9032c1684ac19c8fb9b879df513e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:10:11.026830  6919 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Bootstrap starting.
I20260504 14:10:11.029605  6919 tablet_bootstrap.cc:654] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Neither blocks nor log segments found. Creating new log.
I20260504 14:10:11.030261  6921 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Bootstrap starting.
I20260504 14:10:11.031036  6919 log.cc:826] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Log is configured to *not* fsync() on all Append() calls
I20260504 14:10:11.034607  6920 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Bootstrap starting.
I20260504 14:10:11.037117  6920 tablet_bootstrap.cc:654] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Neither blocks nor log segments found. Creating new log.
I20260504 14:10:11.033082  6921 tablet_bootstrap.cc:654] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Neither blocks nor log segments found. Creating new log.
I20260504 14:10:11.039300  6921 log.cc:826] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Log is configured to *not* fsync() on all Append() calls
I20260504 14:10:11.040530  6919 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: No bootstrap required, opened a new log
I20260504 14:10:11.040735  6920 log.cc:826] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Log is configured to *not* fsync() on all Append() calls
I20260504 14:10:11.040781  6919 ts_tablet_manager.cc:1403] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Time spent bootstrapping tablet: real 0.014s	user 0.005s	sys 0.000s
I20260504 14:10:11.044658  6919 raft_consensus.cc:359] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.044932  6919 raft_consensus.cc:385] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:10:11.044996  6919 raft_consensus.cc:740] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c5d5ef2e3b7d49429fd43768e4e3ae9a, State: Initialized, Role: FOLLOWER
I20260504 14:10:11.045539  6919 consensus_queue.cc:260] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.046626  6919 ts_tablet_manager.cc:1434] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Time spent starting tablet: real 0.006s	user 0.007s	sys 0.000s
I20260504 14:10:11.048087  6921 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: No bootstrap required, opened a new log
I20260504 14:10:11.048386  6921 ts_tablet_manager.cc:1403] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Time spent bootstrapping tablet: real 0.018s	user 0.006s	sys 0.000s
I20260504 14:10:11.052254  6921 raft_consensus.cc:359] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.052634  6921 raft_consensus.cc:385] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:10:11.052735  6921 raft_consensus.cc:740] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5967e8b7dd55430493459214df9f79fa, State: Initialized, Role: FOLLOWER
I20260504 14:10:11.053345  6898 heartbeater.cc:499] Master 127.25.254.254:37897 was elected leader, sending a full tablet report...
I20260504 14:10:11.053339  6921 consensus_queue.cc:260] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.054720  6921 ts_tablet_manager.cc:1434] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Time spent starting tablet: real 0.006s	user 0.005s	sys 0.000s
W20260504 14:10:11.057225  6899 tablet.cc:2404] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:10:11.067940  6920 tablet_bootstrap.cc:492] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: No bootstrap required, opened a new log
I20260504 14:10:11.094468  6920 ts_tablet_manager.cc:1403] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Time spent bootstrapping tablet: real 0.060s	user 0.000s	sys 0.006s
I20260504 14:10:11.108826  6920 raft_consensus.cc:359] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.114923  6920 raft_consensus.cc:385] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:10:11.116709  6920 raft_consensus.cc:740] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 93019787649c421e94cbf1022725e556, State: Initialized, Role: FOLLOWER
I20260504 14:10:11.117565  6920 consensus_queue.cc:260] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.120098  6920 ts_tablet_manager.cc:1434] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Time spent starting tablet: real 0.025s	user 0.000s	sys 0.006s
I20260504 14:10:11.120582  6760 heartbeater.cc:499] Master 127.25.254.254:37897 was elected leader, sending a full tablet report...
W20260504 14:10:11.234436  6622 tablet.cc:2404] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:10:11.244776  6926 raft_consensus.cc:493] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260504 14:10:11.245000  6926 raft_consensus.cc:515] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.246313  6926 leader_election.cc:290] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 93019787649c421e94cbf1022725e556 (127.25.254.194:35361), 5967e8b7dd55430493459214df9f79fa (127.25.254.193:38475)
I20260504 14:10:11.250069  6900 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.246748 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:38475 (local address 127.25.254.195:39573)
0504 14:10:11.246900 (+   152us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:11.246919 (+    19us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:11.247037 (+   118us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:11.247640 (+   603us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:11.247643 (+     3us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:11.247663 (+    20us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:10:11.247905 (+   242us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:11.247911 (+     6us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.248880 (+   969us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:11.248884 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:11.249703 (+   819us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:11.249712 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.249825 (+   113us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.249843 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:10:11.249889 (+    46us) client_negotiation.cc:241] Negotiation successful
0504 14:10:11.249943 (+    54us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":41}
I20260504 14:10:11.251646  6916 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.247159 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:39573 (local address 127.25.254.193:38475)
0504 14:10:11.247288 (+   129us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:11.247295 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:11.247313 (+    18us) server_negotiation.cc:408] Connection header received
0504 14:10:11.247372 (+    59us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:11.247375 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:11.247439 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:11.247522 (+    83us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:10:11.248035 (+   513us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.248746 (+   711us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:11.250892 (+  2146us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.251372 (+   480us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.251421 (+    49us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:11.251473 (+    52us) server_negotiation.cc:300] Negotiation successful
0504 14:10:11.251528 (+    55us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":33}
I20260504 14:10:11.252254  6575 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "cabb9032c1684ac19c8fb9b879df513e" candidate_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5967e8b7dd55430493459214df9f79fa" is_pre_election: true
I20260504 14:10:11.252542  6575 raft_consensus.cc:2468] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c5d5ef2e3b7d49429fd43768e4e3ae9a in term 0.
I20260504 14:10:11.253314  6786 leader_election.cc:304] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5967e8b7dd55430493459214df9f79fa, c5d5ef2e3b7d49429fd43768e4e3ae9a; no voters: 
I20260504 14:10:11.253602  6926 raft_consensus.cc:2804] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260504 14:10:11.253659  6926 raft_consensus.cc:493] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260504 14:10:11.253696  6926 raft_consensus.cc:3060] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:10:11.254984  6926 raft_consensus.cc:515] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.255371  6926 leader_election.cc:290] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [CANDIDATE]: Term 1 election: Requested vote from peers 93019787649c421e94cbf1022725e556 (127.25.254.194:35361), 5967e8b7dd55430493459214df9f79fa (127.25.254.193:38475)
I20260504 14:10:11.256034  6575 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "cabb9032c1684ac19c8fb9b879df513e" candidate_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5967e8b7dd55430493459214df9f79fa"
I20260504 14:10:11.256160  6575 raft_consensus.cc:3060] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:10:11.257411  6575 raft_consensus.cc:2468] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c5d5ef2e3b7d49429fd43768e4e3ae9a in term 1.
I20260504 14:10:11.257841  6786 leader_election.cc:304] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5967e8b7dd55430493459214df9f79fa, c5d5ef2e3b7d49429fd43768e4e3ae9a; no voters: 
I20260504 14:10:11.258067  6926 raft_consensus.cc:2804] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:10:11.258318  6926 raft_consensus.cc:697] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 1 LEADER]: Becoming Leader. State: Replica: c5d5ef2e3b7d49429fd43768e4e3ae9a, State: Running, Role: LEADER
I20260504 14:10:11.258653  6926 consensus_queue.cc:237] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } }
I20260504 14:10:11.261935  6900 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.247407 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.194:35361 (local address 127.25.254.195:56555)
0504 14:10:11.250279 (+  2872us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:11.250293 (+    14us) client_negotiation.cc:175] Beginning negotiation
0504 14:10:11.250373 (+    80us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:10:11.259343 (+  8970us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:10:11.259347 (+     4us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:10:11.259374 (+    27us) client_negotiation.cc:190] Negotiated authn=CERTIFICATE
0504 14:10:11.259639 (+   265us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:11.259648 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.260702 (+  1054us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:10:11.260706 (+     4us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:10:11.261541 (+   835us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:10:11.261550 (+     9us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.261663 (+   113us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.261681 (+    18us) client_negotiation.cc:770] Sending connection context
0504 14:10:11.261730 (+    49us) client_negotiation.cc:241] Negotiation successful
0504 14:10:11.261794 (+    64us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":2809,"thread_start_us":77,"threads_started":1}
I20260504 14:10:11.262914  6914 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.247528 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.195:56555 (local address 127.25.254.194:35361)
0504 14:10:11.247644 (+   116us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:11.247648 (+     4us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:11.250577 (+  2929us) server_negotiation.cc:408] Connection header received
0504 14:10:11.250639 (+    62us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:11.250642 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:11.250706 (+    64us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:11.250777 (+    71us) server_negotiation.cc:227] Negotiated authn=CERTIFICATE
0504 14:10:11.259811 (+  9034us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.260559 (+   748us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:11.262212 (+  1653us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.262660 (+   448us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.262696 (+    36us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:11.262748 (+    52us) server_negotiation.cc:300] Negotiation successful
0504 14:10:11.262801 (+    53us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":35}
I20260504 14:10:11.266649  6392 catalog_manager.cc:5671] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a reported cstate change: term changed from 0 to 1, leader changed from <none> to c5d5ef2e3b7d49429fd43768e4e3ae9a (127.25.254.195). New cstate: current_term: 1 leader_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" member_type: VOTER last_known_addr { host: "127.25.254.195" port: 40163 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 } health_report { overall_health: UNKNOWN } } }
I20260504 14:10:11.269016  6695 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "cabb9032c1684ac19c8fb9b879df513e" candidate_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "93019787649c421e94cbf1022725e556" is_pre_election: true
I20260504 14:10:11.269323  6695 raft_consensus.cc:2468] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c5d5ef2e3b7d49429fd43768e4e3ae9a in term 0.
I20260504 14:10:11.269688  6696 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "cabb9032c1684ac19c8fb9b879df513e" candidate_uuid: "c5d5ef2e3b7d49429fd43768e4e3ae9a" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "93019787649c421e94cbf1022725e556"
I20260504 14:10:11.269783  6696 raft_consensus.cc:3060] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:10:11.271091  6696 raft_consensus.cc:2468] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c5d5ef2e3b7d49429fd43768e4e3ae9a in term 1.
W20260504 14:10:11.295646  6761 tablet.cc:2404] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260504 14:10:11.350803  6918 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.346522 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:34692 (local address 127.25.254.195:40163)
0504 14:10:11.346660 (+   138us) server_negotiation.cc:207] Beginning negotiation
0504 14:10:11.346665 (+     5us) server_negotiation.cc:400] Waiting for connection header
0504 14:10:11.346679 (+    14us) server_negotiation.cc:408] Connection header received
0504 14:10:11.346775 (+    96us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:10:11.346779 (+     4us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:10:11.346840 (+    61us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:10:11.346934 (+    94us) server_negotiation.cc:227] Negotiated authn=TOKEN
0504 14:10:11.347504 (+   570us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.348244 (+   740us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:10:11.349263 (+  1019us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:10:11.349469 (+   206us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:10:11.349572 (+   103us) server_negotiation.cc:366] Received TOKEN_EXCHANGE NegotiatePB request
0504 14:10:11.350068 (+   496us) server_negotiation.cc:378] Sending TOKEN_EXCHANGE NegotiatePB response
0504 14:10:11.350583 (+   515us) server_negotiation.cc:1036] Waiting for connection context
0504 14:10:11.350616 (+    33us) server_negotiation.cc:300] Negotiation successful
0504 14:10:11.350664 (+    48us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":55}
I20260504 14:10:11.367887  6575 raft_consensus.cc:1275] T cabb9032c1684ac19c8fb9b879df513e P 5967e8b7dd55430493459214df9f79fa [term 1 FOLLOWER]: Refusing update from remote peer c5d5ef2e3b7d49429fd43768e4e3ae9a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:10:11.368777  6930 consensus_queue.cc:1048] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [LEADER]: Connected to new peer: Peer: permanent_uuid: "5967e8b7dd55430493459214df9f79fa" member_type: VOTER last_known_addr { host: "127.25.254.193" port: 38475 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:10:11.370379  6696 raft_consensus.cc:1275] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 1 FOLLOWER]: Refusing update from remote peer c5d5ef2e3b7d49429fd43768e4e3ae9a: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260504 14:10:11.376366  6930 consensus_queue.cc:1048] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [LEADER]: Connected to new peer: Peer: permanent_uuid: "93019787649c421e94cbf1022725e556" member_type: VOTER last_known_addr { host: "127.25.254.194" port: 35361 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260504 14:10:11.394402  6933 mvcc.cc:204] Tried to move back new op lower bound from 7282294011351879680 to 7282294010935631872. Current Snapshot: MvccSnapshot[applied={T|T < 7282294011351879680}]
I20260504 14:10:11.395604  6936 mvcc.cc:204] Tried to move back new op lower bound from 7282294011351879680 to 7282294010935631872. Current Snapshot: MvccSnapshot[applied={T|T < 7282294011351879680}]
I20260504 14:10:11.414567  6934 mvcc.cc:204] Tried to move back new op lower bound from 7282294011351879680 to 7282294010935631872. Current Snapshot: MvccSnapshot[applied={T|T < 7282294011351879680}]
I20260504 14:10:11.424523  6392 catalog_manager.cc:2507] Servicing SoftDeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:40320:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:10:11.424746  6392 catalog_manager.cc:2755] Servicing DeleteTable request from {username='test-admin', principal='test-admin@KRBTEST.COM'} at 127.0.0.1:40320:
table { table_name: "test-table" } modify_external_catalogs: true
I20260504 14:10:11.427667  6392 catalog_manager.cc:5958] T 00000000000000000000000000000000 P 0a97dbe8ab0646d5b4416094cac5d927: Sending DeleteTablet for 3 replicas of tablet cabb9032c1684ac19c8fb9b879df513e
I20260504 14:10:11.428673  6554 tablet_service.cc:1558] Processing DeleteTablet for tablet cabb9032c1684ac19c8fb9b879df513e with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:10:11 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:54626
I20260504 14:10:11.429584  6832 tablet_service.cc:1558] Processing DeleteTablet for tablet cabb9032c1684ac19c8fb9b879df513e with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:10:11 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:34682
I20260504 14:10:11.429749  6694 tablet_service.cc:1558] Processing DeleteTablet for tablet cabb9032c1684ac19c8fb9b879df513e with delete_type TABLET_DATA_DELETED (Table deleted at 2026-05-04 14:10:11 UTC) from {username='kudu', principal='kudu/127.25.254.254@KRBTEST.COM'} at 127.0.0.1:48938
I20260504 14:10:11.431303 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 6488
I20260504 14:10:11.444756  6943 tablet_replica.cc:333] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: stopping tablet replica
I20260504 14:10:11.445254  6943 raft_consensus.cc:2243] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 1 FOLLOWER]: Raft consensus shutting down.
I20260504 14:10:11.446569  6943 raft_consensus.cc:2272] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556 [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:10:11.450484  6942 tablet_replica.cc:333] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: stopping tablet replica
I20260504 14:10:11.454342  6942 raft_consensus.cc:2243] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 1 LEADER]: Raft consensus shutting down.
I20260504 14:10:11.455742  6943 ts_tablet_manager.cc:1916] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:10:11.456354  6942 raft_consensus.cc:2272] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a [term 1 FOLLOWER]: Raft consensus is shut down!
I20260504 14:10:11.461156  6943 ts_tablet_manager.cc:1929] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:10:11.462734  6942 ts_tablet_manager.cc:1916] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Deleting tablet data with delete state TABLET_DATA_DELETED
I20260504 14:10:11.468850  6942 ts_tablet_manager.cc:1929] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 1.2
I20260504 14:10:11.478276  6942 log.cc:1199] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-2/wal/wals/cabb9032c1684ac19c8fb9b879df513e
I20260504 14:10:11.478874  6942 ts_tablet_manager.cc:1950] T cabb9032c1684ac19c8fb9b879df513e P c5d5ef2e3b7d49429fd43768e4e3ae9a: Deleting consensus metadata
I20260504 14:10:11.479521  6943 log.cc:1199] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Deleting WAL directory at /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ts-1/wal/wals/cabb9032c1684ac19c8fb9b879df513e
I20260504 14:10:11.479933  6943 ts_tablet_manager.cc:1950] T cabb9032c1684ac19c8fb9b879df513e P 93019787649c421e94cbf1022725e556: Deleting consensus metadata
I20260504 14:10:11.480563  6380 catalog_manager.cc:5002] TS c5d5ef2e3b7d49429fd43768e4e3ae9a (127.25.254.195:40163): tablet cabb9032c1684ac19c8fb9b879df513e (table test-table [id=dffbd474f1ae46afbaedc44c012d0489]) successfully deleted
I20260504 14:10:11.481073  6380 catalog_manager.cc:5002] TS 93019787649c421e94cbf1022725e556 (127.25.254.194:35361): tablet cabb9032c1684ac19c8fb9b879df513e (table test-table [id=dffbd474f1ae46afbaedc44c012d0489]) successfully deleted
W20260504 14:10:11.481215  6378 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.25.254.193:38475 (error 108)
I20260504 14:10:11.482336 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 6629
I20260504 14:10:11.489393  6913 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:10:11.488988 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.193:38475 (local address 127.0.0.1:54636)
0504 14:10:11.489220 (+   232us) negotiation.cc:107] Waiting for socket to connect
0504 14:10:11.489292 (+    72us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.193:38475: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":133}
W20260504 14:10:11.489609  6378 catalog_manager.cc:4729] TS 5967e8b7dd55430493459214df9f79fa (127.25.254.193:38475): DeleteTablet:TABLET_DATA_DELETED RPC failed for tablet cabb9032c1684ac19c8fb9b879df513e: Network error: Client connection negotiation failed: client connection to 127.25.254.193:38475: connect: Connection refused (error 111)
I20260504 14:10:11.491323 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 6769
I20260504 14:10:11.501056 26619 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu with pid 6359
I20260504 14:10:11.510593 26619 mini_ranger_kms.cc:60] Stopping Ranger KMS...
14:10:11.514 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem -- FileSystem.close() by method: org.apache.hadoop.fs.FilterFileSystem.close(FilterFileSystem.java:529)); Key: (keyadmin@KRBTEST.COM (auth:KERBEROS))@file://; URI: file:///; Object Identity Hash: 16c98373
14:10:11.514 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem -- FileSystem.close() by method: org.apache.hadoop.fs.RawLocalFileSystem.close(RawLocalFileSystem.java:895)); Key: null; URI: file:///; Object Identity Hash: 180e7fda
14:10:11.515 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- ==> JVMShutdownHook.run()
14:10:11.516 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- JVMShutdownHook: Signalling async audit cleanup to start.
14:10:11.516 [Thread-3] DEBUG org.apache.hadoop.util.ShutdownHookManager -- Completed shutdown in 0.003 seconds; Timeouts: 0
14:10:11.516 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Starting cleanup
14:10:11.516 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Done cleanup
14:10:11.516 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
14:10:11.516 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- JVMShutdownHook: Waiting up to 30 seconds for audit cleanup to finish.
14:10:11.516 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- JVMShutdownHook: Audit cleanup finished after 0 milli seconds
14:10:11.516 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- JVMShutdownHook: Interrupting ranger async audit cleanup thread
14:10:11.516 [shutdown-hook-0] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- <== JVMShutdownHook.run()
14:10:11.516 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Interrupted while waiting for audit startCleanup signal!  Exiting the thread...
java.lang.InterruptedException: null
	at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1048)
	at java.base/java.util.concurrent.Semaphore.acquire(Semaphore.java:318)
	at org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup.run(AuditProviderFactory.java:503)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:10:11.517 [shutdown-hook-0] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- ===>> KMSMetricWrapper.writeJsonMetricsToFile()
14:10:11.517 [shutdown-hook-0] INFO kms-metrics -- {}
14:10:11.517 [shutdown-hook-0] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- <<=== KMSMetricWrapper.writeJsonMetricsToFile()
14:10:11.518 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem -- FileSystem.close() by method: org.apache.hadoop.fs.FilterFileSystem.close(FilterFileSystem.java:529)); Key: (rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS))@file://; URI: file:///; Object Identity Hash: 58a4e056
14:10:11.518 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem -- FileSystem.close() by method: org.apache.hadoop.fs.RawLocalFileSystem.close(RawLocalFileSystem.java:895)); Key: null; URI: file:///; Object Identity Hash: 1da66526
14:10:11.518 [Thread-7] DEBUG org.apache.hadoop.util.ShutdownHookManager -- Completed shutdown in 0.006 seconds; Timeouts: 0
14:10:11.518 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-default.xml) 
14:10:11.518 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-default.xml): calling componentClassLoader.getResources()
14:10:11.519 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-default.xml): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar!/core-default.xml
14:10:11.522 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-site.xml) 
14:10:11.523 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-site.xml): calling componentClassLoader.getResources()
14:10:11.523 [Thread-7] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-site.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegration.1777903638260922-26619-0/minicluster-data/ranger-kms/core-site.xml
14:10:11.523 [Thread-7] DEBUG org.apache.hadoop.util.ShutdownHookManager -- ShutdownHookManager completed shutdown.
14:10:11.525 [Thread-3] DEBUG org.apache.hadoop.util.ShutdownHookManager -- ShutdownHookManager completed shutdown.
I20260504 14:10:11.877311 26619 mini_ranger_kms.cc:62] Stopped Ranger KMS
I20260504 14:10:11.877487 26619 mini_ranger.cc:67] Stopping Ranger...
I20260504 14:10:12.233984 26619 mini_ranger.cc:69] Stopped Ranger
2026-05-04 14:10:12.234 UTC [5652] LOG:  received smart shutdown request
2026-05-04 14:10:12.238 UTC [5652] LOG:  background worker "logical replication launcher" (PID 5660) exited with exit code 1
2026-05-04 14:10:12.239 UTC [5655] LOG:  shutting down
2026-05-04 14:10:12.241 UTC [5655] LOG:  checkpoint starting: shutdown immediate
2026-05-04 14:10:13.553 UTC [5655] LOG:  checkpoint complete: wrote 2471 buffers (15.1%); 0 WAL file(s) added, 0 removed, 1 recycled; write=0.039 s, sync=1.264 s, total=1.315 s; sync files=1146, longest=0.008 s, average=0.002 s; distance=12602 kB, estimate=12602 kB; lsn=0/20C9398, redo lsn=0/20C9398
2026-05-04 14:10:13.562 UTC [5652] LOG:  database system is shut down
2026-05-04T14:10:13Z chronyd exiting
[       OK ] SecurityITest.TestEncryptionWithKMSIntegration (75750 ms)
[ RUN      ] SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers
Loading random data
Initializing database '/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/principal' for realm 'KRBTEST.COM',
master key name 'K/M@KRBTEST.COM'
May 04 14:10:13 dist-test-slave-2x32 krb5kdc[6956](info): setting up network...
krb5kdc: setsockopt(10,IPV6_V6ONLY,1) worked
May 04 14:10:13 dist-test-slave-2x32 krb5kdc[6956](info): set up 2 sockets
May 04 14:10:13 dist-test-slave-2x32 krb5kdc[6956](info): commencing operation
krb5kdc: starting...
W20260504 14:10:15.762530 26619 mini_kdc.cc:121] Time spent starting KDC: real 2.050s	user 0.000s	sys 0.007s
WARNING: no policy specified for test-admin@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-admin@KRBTEST.COM" created.
WARNING: no policy specified for test-user@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "test-user@KRBTEST.COM" created.
WARNING: no policy specified for joe-interloper@KRBTEST.COM; defaulting to no policy
Authenticating as principal slave/admin@KRBTEST.COM with password.
Principal "joe-interloper@KRBTEST.COM" created.
Authenticating as principal slave/admin@KRBTEST.COM with password.
Entry for principal test-user with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/test-user.keytab.
Entry for principal test-user with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/test-user.keytab.
May 04 14:10:15 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903815, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
2026-05-04T14:10:15Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-04T14:10:15Z Disabled control of system clock
WARNING: no policy specified for rangeradmin/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangeradmin/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangeradmin/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangeradmin_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangeradmin/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangeradmin_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for rangerlookup/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangerlookup/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangerlookup/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerlookup_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangerlookup/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerlookup_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for HTTP/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
I20260504 14:10:15.941943 26619 mini_postgres.cc:62] Running initdb...
The files belonging to this database system will be owned by user "slave".
This user must also own the server process.

The database cluster will be initialized with locale "C".
The default database encoding has accordingly been set to "SQL_ASCII".
The default text search configuration will be set to "english".

Data page checksums are disabled.

creating directory /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/postgres ... ok
creating subdirectories ... ok
selecting dynamic shared memory implementation ... posix
selecting default "max_connections" ... 100
selecting default "shared_buffers" ... 128MB
selecting default time zone ... Etc/UTC
creating configuration files ... ok
running bootstrap script ... ok
performing post-bootstrap initialization ... ok
syncing data to disk ... ok

initdb: warning: enabling "trust" authentication for local connections
initdb: hint: You can change this by editing pg_hba.conf or using the option -A, or --auth-local and --auth-host, the next time you run initdb.

Success. You can now start the database server using:

    /tmp/dist-test-taskMMfo7I/build/debug/bin/postgres/pg_ctl -D /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/postgres -l logfile start

2026-05-04 14:10:19.119 UTC [6987] LOG:  starting PostgreSQL 17.2 on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0, 64-bit
2026-05-04 14:10:19.119 UTC [6987] LOG:  listening on IPv4 address "127.25.254.212", port 55213
2026-05-04 14:10:19.124 UTC [6987] LOG:  listening on Unix socket "/tmp/.s.PGSQL.55213"
2026-05-04 14:10:19.132 UTC [6992] LOG:  database system was shut down at 2026-05-04 14:10:17 UTC
2026-05-04 14:10:19.139 UTC [6987] LOG:  database system is ready to accept connections
I20260504 14:10:21.078086 26619 mini_postgres.cc:96] Postgres bound to 55213
2026-05-04 14:10:21.084 UTC [6998] FATAL:  database "slave" does not exist
127.25.254.212:55213 - accepting connections
I20260504 14:10:21.085064 26619 mini_ranger.cc:162] Starting Ranger...
I20260504 14:10:21.103806 26619 mini_ranger.cc:85] Created miniranger Postgres user
I20260504 14:10:21.176033 26619 mini_ranger.cc:88] Created ranger Postgres database
I20260504 14:10:21.176167 26619 mini_ranger.cc:179] Starting Ranger out of /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin
2026-05-04 14:10:21,576  [I] DB FLAVOR :POSTGRES
2026-05-04 14:10:21,577  [I] --------- Verifying Ranger DB connection ---------
2026-05-04 14:10:21,577  [I] Checking connection..
2026-05-04 14:10:21,577  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select 1;"
2026-05-04 14:10:21,919  [I] Checking connection passed.
2026-05-04 14:10:21,920  [I] --------- Verifying version history table ---------
2026-05-04 14:10:21,920  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:10:22,283  [I] Table x_db_version_h does not exist in database ranger
2026-05-04 14:10:22,283  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:10:22,605  [I] Table x_db_version_h does not exist in database ranger
2026-05-04 14:10:22,606  [I] Importing x_db_version_h table schema to database ranger from file: create_dbversion_catalog.sql
2026-05-04 14:10:22,606  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/db/postgres/create_dbversion_catalog.sql 
2026-05-04 14:10:22.933 UTC [7113] WARNING:  there is no transaction in progress
2026-05-04 14:10:22,950  [I] create_dbversion_catalog.sql file imported successfully
2026-05-04 14:10:22,950  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_db_version_h') as temp;"
2026-05-04 14:10:23,282  [I] Table x_db_version_h already exists in database 'ranger'
2026-05-04 14:10:23,282  [I] --------- Importing Ranger Core DB Schema ---------
2026-05-04 14:10:23,282  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'CORE_DB_SCHEMA' and active = 'Y';"
2026-05-04 14:10:23,585  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'CORE_DB_SCHEMA' and active = 'N';"
2026-05-04 14:10:23,954  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "insert into x_db_version_h (version, inst_at, inst_by, updated_at, updated_by,active) values ('CORE_DB_SCHEMA', current_timestamp, 'Ranger 2.6.0', current_timestamp, 'dist-test-slave-2x32.c.gcp-upstream.internal','N') ;"
2026-05-04 14:10:24,317  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_portal_user') as temp;"
2026-05-04 14:10:24,668  [I] Table x_portal_user does not exist in database ranger
2026-05-04 14:10:24,668  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_policy_ref_group') as temp;"
2026-05-04 14:10:25,006  [I] Table x_policy_ref_group does not exist in database ranger
2026-05-04 14:10:25,007  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and active = 'Y';"
2026-05-04 14:10:25,347  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'JAVA_PATCHES' and active = 'Y';"
2026-05-04 14:10:25,660  [I] Importing DB schema to database ranger from file: ranger_core_db_postgres.sql
2026-05-04 14:10:25,660  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/db/postgres/optimized/current/ranger_core_db_postgres.sql 
2026-05-04 14:10:26.609 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.621 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.636 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.647 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.661 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.676 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.746 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.754 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.766 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.780 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.791 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.802 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.812 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.822 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.828 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.838 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.845 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.854 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.861 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.868 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.873 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:26.877 UTC [7311] WARNING:  there is no transaction in progress
2026-05-04 14:10:27,480  [I] ranger_core_db_postgres.sql file imported successfully
2026-05-04 14:10:27,481  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "update x_db_version_h set inst_by='Ranger 2.6.0' where active='Y' and updated_by='localhost';"
2026-05-04 14:10:27,833  [I] Patches status entries updated from base ranger version to current installed ranger version:Ranger 2.6.0
2026-05-04 14:10:27,833  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_portal_user') as temp;"
2026-05-04 14:10:28,187  [I] Table x_portal_user already exists in database 'ranger'
2026-05-04 14:10:28,187  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select * from (select table_name from information_schema.tables where table_catalog='ranger' and table_name = 'x_policy_ref_group') as temp;"
2026-05-04 14:10:28,532  [I] Table x_policy_ref_group already exists in database 'ranger'
2026-05-04 14:10:28,532  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and active = 'Y';"
2026-05-04 14:10:28,858  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'JAVA_PATCHES' and active = 'Y';"
2026-05-04 14:10:29,206  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "update x_db_version_h set active='Y' where version='CORE_DB_SCHEMA' and active='N' and updated_by='dist-test-slave-2x32.c.gcp-upstream.internal';"
2026-05-04 14:10:29,577  [I] CORE_DB_SCHEMA import status has been updated
2026-05-04 14:10:29,577  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java  -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/ranger -u miniranger -p '********' -noheader -trim -c \;  -query "select version from x_db_version_h where version = 'DB_PATCHES' and inst_by = 'Ranger 2.6.0' and active = 'Y';"
2026-05-04 14:10:29,903  [I] DB_PATCHES have already been applied
I20260504 14:10:29.910315 26619 mini_ranger.cc:192] Using Ranger class path: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/ews/lib/*:/usr/lib/jvm/temurin-17-jdk-amd64/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/hadoop-3.4.1/*:/tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-admin/ews/webapp
I20260504 14:10:29.910419 26619 mini_ranger.cc:194] Using host: 127.25.254.212
I20260504 14:10:29.913484 26619 mini_ranger.cc:240] Ranger admin URL: http://127.25.254.212:40687
May 04, 2026 2:10:30 PM org.apache.ranger.server.tomcat.EmbeddedServer getKeyManagers
WARNING: Config 'ranger.keystore.file' or 'ranger.service.https.attrib.keystore.file' is not found or contains blank value
May 04, 2026 2:10:30 PM org.apache.ranger.server.tomcat.EmbeddedServer getTrustManagers
WARNING: Config 'ranger.truststore.file' is not found or contains blank value!
May 04, 2026 2:10:30 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Deriving webapp folder from catalina.base property. folder=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp
May 04, 2026 2:10:30 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Webapp file =/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp, webAppName = /
May 04, 2026 2:10:30 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Adding webapp [/] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp] .....
May 04, 2026 2:10:30 PM org.apache.catalina.core.StandardContext setPath
WARNING: A context path must either be an empty string or start with a '/' and do not end with a '/'. The path [/] does not meet these criteria and has been changed to []
May 04, 2026 2:10:31 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Finished init of webapp [/] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-admin/ews/webapp].
May 04, 2026 2:10:31 PM org.apache.ranger.server.tomcat.EmbeddedServer startServer
INFO: Server Name : miniranger
May 04, 2026 2:10:31 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-nio-40687"]
May 04, 2026 2:10:31 PM org.apache.catalina.core.StandardService startInternal
INFO: Starting service [Tomcat]
May 04, 2026 2:10:31 PM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet engine: [Apache Tomcat/9.0.98]
May 04, 2026 2:10:32 PM org.apache.catalina.startup.ContextConfig getDefaultWebXmlFragment
INFO: No global web.xml found
I20260504 14:10:32.333658 26619 mini_ranger.cc:161] Time spent starting Ranger: real 11.249s	user 0.000s	sys 0.011s
May 04, 2026 2:10:36 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
May 04, 2026 2:10:36 PM org.apache.catalina.core.ApplicationContext log
INFO: Initializing Spring root WebApplicationContext
[EL Warning]: metadata: 2026-05-04 14:10:40.125--ServerSession(208972530)--You have specified multiple ids for the entity class [org.apache.ranger.entity.view.VXXPrincipal] without specifying an @IdClass. By doing this you may lose the ability to find by identity, distributed cache support etc. Note: You may however use EntityManager find operations by passing a list of primary key fields. Else, you will have to use JPQL queries to read your entities. For other id options see @PrimaryKey.
May 04, 2026 2:10:56 PM com.sun.jersey.api.core.PackagesResourceConfig init
INFO: Scanning for root resource and provider classes in the packages:
  org.apache.ranger.rest
  org.apache.ranger.common
  xa.rest
May 04, 2026 2:10:56 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found:
  class org.apache.ranger.rest.UserREST
  class org.apache.ranger.rest.MetricsREST
  class org.apache.ranger.rest.XUserREST
  class org.apache.ranger.rest.XAuditREST
  class org.apache.ranger.rest.TagREST
  class org.apache.ranger.rest.XKeyREST
  class org.apache.ranger.rest.AssetREST
  class org.apache.ranger.rest.PublicAPIsv2
  class org.apache.ranger.rest.PublicAPIs
  class org.apache.ranger.rest.RoleREST
  class org.apache.ranger.rest.SecurityZoneREST
  class org.apache.ranger.rest.ServiceREST
May 04, 2026 2:10:56 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found:
  class org.apache.ranger.common.RangerJAXBContextResolver
  class org.apache.ranger.common.RangerJsonProvider
  class org.apache.ranger.common.RangerJsonMappingExceptionMapper
  class org.apache.ranger.common.RangerJsonParserExceptionMapper
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.servlet.SpringServlet getContext
INFO: Using default applicationContext
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonMappingExceptionMapper, of type org.apache.ranger.common.RangerJsonMappingExceptionMapper as a provider class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonParserExceptionMapper, of type org.apache.ranger.common.RangerJsonParserExceptionMapper as a provider class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, rangerJsonProvider, of type org.apache.ranger.common.RangerJsonProvider as a provider class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, assetREST, of type org.apache.ranger.rest.AssetREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, metricsREST, of type org.apache.ranger.rest.MetricsREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, publicAPIs, of type org.apache.ranger.rest.PublicAPIs as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, publicAPIsv2, of type org.apache.ranger.rest.PublicAPIsv2 as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, roleREST, of type org.apache.ranger.rest.RoleREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, securityZoneREST, of type org.apache.ranger.rest.SecurityZoneREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, serviceREST, of type org.apache.ranger.rest.ServiceREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, tagREST, of type org.apache.ranger.rest.TagREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, userREST, of type org.apache.ranger.rest.UserREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XAuditREST, of type org.apache.ranger.rest.XAuditREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XKeyREST, of type org.apache.ranger.rest.XKeyREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.spi.spring.container.SpringComponentProviderFactory registerSpringBeans
INFO: Registering Spring bean, XUserREST, of type org.apache.ranger.rest.XUserREST as a root resource class
May 04, 2026 2:10:56 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:20 PM'
May 04, 2026 2:10:57 PM com.sun.jersey.spi.inject.Errors processErrorMessages
WARNING: The following warnings have been detected with resource and/or provider classes:
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.RoleREST.getRolesInJson(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse), MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInExcel(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse), MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInCsv(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) throws java.io.IOException, MUST return a non-void type.
  WARNING: A HTTP GET method, public void org.apache.ranger.rest.ServiceREST.getPoliciesInJson(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse,java.lang.Boolean), MUST return a non-void type.
May 04, 2026 2:10:57 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-nio-40687"]
I20260504 14:10:58.400034 26619 mini_ranger.cc:274] Created Kudu service
WARNING: no policy specified for rangerkms/127.25.254.212@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "rangerkms/127.25.254.212@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal rangerkms/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal rangerkms/127.25.254.212@KRBTEST.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for HTTP/127.25.254.212@KRBTEST.COM; defaulting to no policy
add_principal: Principal or policy already exists while creating "HTTP/127.25.254.212@KRBTEST.COM".
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 3, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
Entry for principal HTTP/127.25.254.212@KRBTEST.COM with kvno 3, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab.
WARNING: no policy specified for keyadmin@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "keyadmin@KRBTEST.COM" created.
May 04 14:10:58 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903858, etypes {rep=17 tkt=17 ses=17}, keyadmin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for keyadmin@KRBTEST.COM: 
I20260504 14:10:58.515053 26619 mini_ranger_kms.cc:208] Starting Ranger KMS...
I20260504 14:10:58.527753 26619 mini_ranger_kms.cc:78] Created minirangerkms Postgres user
I20260504 14:10:58.665866 26619 mini_ranger_kms.cc:81] Created rangerkms Postgres database
I20260504 14:10:58.665961 26619 mini_ranger_kms.cc:226] Starting Ranger KMS out of /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms
I20260504 14:10:58.665978 26619 mini_ranger_kms.cc:227] Using postgres at 127.25.254.212:55213
2026-05-04 14:10:58,902  [I] DB FLAVOR :POSTGRES
2026-05-04 14:10:58,903  [I] --------- Verifying Ranger DB connection ---------
2026-05-04 14:10:58,903  [I] Checking connection
2026-05-04 14:10:58,903  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -query "SELECT 1;"
2026-05-04 14:10:59,225  [I] connection success
2026-05-04 14:10:59,226  [I] --------- Verifying Ranger DB tables ---------
2026-05-04 14:10:59,226  [I] Verifying table ranger_masterkey in database rangerkms
2026-05-04 14:10:59,226  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -query "select * from (select table_name from information_schema.tables where table_catalog='rangerkms' and table_name = 'ranger_masterkey') as temp;"
2026-05-04 14:10:59,573  [I] Table ranger_masterkey does not exist in database rangerkms
2026-05-04 14:10:59,574  [I] --------- Importing Ranger Core DB Schema ---------
2026-05-04 14:10:59,574  [I] Importing db schema to database rangerkms from file: kms_core_db_postgres.sql
2026-05-04 14:10:59,574  [JISQL] /usr/lib/jvm/temurin-17-jdk-amd64/bin/java   -cp /tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/jisql/lib/* org.apache.util.sql.Jisql -driver postgresql -cstring jdbc:postgresql://127.25.254.212:55213/rangerkms -u rangerkms -p '********' -noheader -trim -c \; -input /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/db/postgres/kms_core_db_postgres.sql
2026-05-04 14:10:59,921  [I] kms_core_db_postgres.sql DB schema imported successfully
I20260504 14:11:00.386389 26619 mini_ranger_kms.cc:326] Created kms service
I20260504 14:11:00.584872 26619 mini_ranger_kms.cc:342] Created kudu user
I20260504 14:11:00.634048 26619 mini_ranger_kms.cc:359] Created rangerkms user
May 04 14:11:00 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903858, etypes {rep=17 tkt=17 ses=17}, keyadmin@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
I20260504 14:11:01.043390 26619 mini_ranger_kms.cc:400] Added ranger policy
I20260504 14:11:01.043648 26619 mini_ranger_kms.cc:238] Using RangerKMS classpath: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms:/tmp/dist-test-taskMMfo7I/build/debug/bin/postgresql.jar:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/classes/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/lib/*:/usr/lib/jvm/temurin-17-jdk-amd64/lib/*:/tmp/dist-test-taskMMfo7I/thirdparty/src/hadoop-3.4.1/conf
I20260504 14:11:01.043674 26619 mini_ranger_kms.cc:240] Using host: 127.25.254.212
I20260504 14:11:01.047911 26619 mini_ranger_kms.cc:292] Ranger KMS PID: 7607
I20260504 14:11:01.047995 26619 mini_ranger_kms.cc:293] Ranger KMS URL: http://127.25.254.212:51381
14:11:01.988 [main] DEBUG org.apache.hadoop.util.Shell -- setsid exited with exit code 0
14:11:02.002 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/etc/ranger/kms/rangerkms.jceks
14:11:02.129 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
14:11:02.130 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
14:11:02.131 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
14:11:02.131 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
14:11:02.132 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
14:11:02.137 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- UgiMetrics, User and group related metrics
14:11:02.164 [main] DEBUG org.apache.hadoop.security.SecurityUtil -- Setting hadoop.security.token.service.use_ip to true
14:11:02.195 [main] DEBUG org.apache.hadoop.security.Groups --  Creating new Groups object
14:11:02.221 [main] DEBUG org.apache.hadoop.security.Groups -- Group mapping impl=org.apache.hadoop.security.NullGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
14:11:02.300 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:11:02.313 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:11:02.315 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: keyadmin@KRBTEST.COM
14:11:02.318 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "keyadmin@KRBTEST.COM" with name: keyadmin@KRBTEST.COM
14:11:02.318 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "keyadmin@KRBTEST.COM"
14:11:02.318 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- UGI loginUser: keyadmin@KRBTEST.COM (auth:KERBEROS)
14:11:02.322 [TGT Renewer for keyadmin@KRBTEST.COM] DEBUG org.apache.hadoop.security.UserGroupInformation -- Current time is 1777903862322, next refresh is 1777972978000
14:11:02.323 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Acquiring creator semaphore for file:///etc/ranger/kms/rangerkms.jceks
14:11:02.323 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Acquiring creator semaphore for file:///etc/ranger/kms/rangerkms.jceks: duration 0:00.000s
14:11:02.327 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Creating FS file:///etc/ranger/kms/rangerkms.jceks
14:11:02.328 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Loading filesystems
14:11:02.341 [main] DEBUG org.apache.hadoop.fs.FileSystem -- file:// = class org.apache.hadoop.fs.LocalFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:02.346 [main] DEBUG org.apache.hadoop.fs.FileSystem -- viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:02.349 [main] DEBUG org.apache.hadoop.fs.FileSystem -- har:// = class org.apache.hadoop.fs.HarFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:02.352 [main] DEBUG org.apache.hadoop.fs.FileSystem -- http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:02.353 [main] DEBUG org.apache.hadoop.fs.FileSystem -- https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:02.354 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking for FS supporting file
14:11:02.354 [main] DEBUG org.apache.hadoop.fs.FileSystem -- looking for configuration option fs.file.impl
14:11:02.354 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking in service filesystems for implementation class
14:11:02.354 [main] DEBUG org.apache.hadoop.fs.FileSystem -- FS for file is class org.apache.hadoop.fs.LocalFileSystem
14:11:02.365 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Creating FS file:///etc/ranger/kms/rangerkms.jceks: duration 0:00.038s
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer getKeyManagers
WARNING: Config 'ranger.keystore.file' or 'ranger.service.https.attrib.keystore.file' is not found or contains blank value
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer getTrustManagers
WARNING: Config 'ranger.truststore.file' is not found or contains blank value!
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Webapp file =/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp, webAppName = /kms
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Adding webapp [/kms] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp] .....
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer start
INFO: Finished init of webapp [/kms] = path [/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp].
May 04, 2026 2:11:02 PM org.apache.ranger.server.tomcat.EmbeddedServer startServer
INFO: Server Name : minirangerkms
May 04, 2026 2:11:03 PM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-nio-51381"]
May 04, 2026 2:11:03 PM org.apache.catalina.core.StandardService startInternal
INFO: Starting service [Tomcat]
May 04, 2026 2:11:03 PM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet engine: [Apache Tomcat/9.0.98]
I20260504 14:11:03.402554 26619 mini_ranger_kms.cc:207] Time spent starting Ranger KMS: real 4.888s	user 0.014s	sys 0.193s
I20260504 14:11:03.402676 26619 mini_ranger_kms.cc:413] {"name":"kuduclusterkey","cipher":"AES/CTR/NoPadding","length":128,"description":"kuduclusterkey"}
I20260504 14:11:03.402755 26619 mini_ranger_kms.cc:417] 127.25.254.212:51381/kms/v1/keys
May 04, 2026 2:11:03 PM org.apache.catalina.startup.ContextConfig getDefaultWebXmlFragment
INFO: No global web.xml found
May 04, 2026 2:11:16 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
14:11:16.322 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
14:11:16.326 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
14:11:16.326 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
14:11:16.327 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
14:11:16.327 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
14:11:16.329 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- UgiMetrics, User and group related metrics
14:11:16.402 [main] DEBUG org.apache.hadoop.util.Shell -- setsid exited with exit code 0
14:11:16.402 [main] DEBUG org.apache.hadoop.security.SecurityUtil -- Setting hadoop.security.token.service.use_ip to true
14:11:16.427 [main] DEBUG org.apache.hadoop.security.Groups --  Creating new Groups object
14:11:16.509 [main] DEBUG org.apache.hadoop.security.Groups -- Group mapping impl=org.apache.hadoop.security.NullGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
14:11:16.510 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- -------------------------------------------------------------
14:11:16.510 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp --   Java runtime version : 17.0.18+8
14:11:16.516 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp --   KMS Hadoop Version: 3.3.6
14:11:16.516 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- -------------------------------------------------------------
14:11:16.519 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.RangerKmsAuthorizer()
14:11:16.519 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.init()
14:11:16.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFilesForServiceTypeAndPluginclass(kms) Pluging Class :org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:11:16.522 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginImplLibPath for Class (org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:11:16.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginImplLibPath for Class (org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer PATH :/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl)
14:11:16.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFiles()
14:11:16.523 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- ==> RangerPluginClassLoaderUtil.getPluginFiles()
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/lucene-core-8.11.3.jar
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-audit-2.6.0.jar
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-rest-high-level-client-7.10.2.jar
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/gethostname4j-1.0.0.jar
14:11:16.524 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-rest-client-7.10.2.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpclient-4.5.13.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/lang-mustache-client-7.10.2.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/gson-2.9.0.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/joda-time-2.10.6.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/rank-eval-client-7.10.2.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpcore-nio-4.4.14.jar
14:11:16.525 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpmime-4.5.13.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-kms-plugin-2.6.0.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-platform-5.7.0.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/hppc-0.8.0.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-collections-3.2.2.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/solr-solrj-8.11.3.jar
14:11:16.526 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/hive-storage-api-2.7.2.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/zookeeper-3.9.2.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-logging-1.2.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-7.10.2.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/orc-core-1.5.8.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/javax.persistence-2.1.0.jar
14:11:16.527 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/commons-configuration2-2.8.0.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/orc-shims-1.5.8.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpasyncclient-4.1.4.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/aircompressor-0.27.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-cred-2.6.0.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-x-content-7.10.2.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/ranger-plugins-common-2.6.0.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/eclipselink-2.7.12.jar
14:11:16.528 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/httpcore-4.4.14.jar
14:11:16.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- getFilesInDirectory('/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl'): adding /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/elasticsearch-core-7.10.2.jar
14:11:16.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getFilesInDirectory(/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl)
14:11:16.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginFilesForServiceType(): 34 files
14:11:16.529 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoaderUtil -- <== RangerPluginClassLoaderUtil.getPluginFilesForServiceTypeAndPluginclass(kms) Pluging Class :org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:11:16.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:11:16.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): calling childClassLoader.findClass()
14:11:16.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer)
14:11:16.531 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): calling childClassLoader().findClass() 
14:11:16.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Runnable)
14:11:16.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Runnable): calling childClassLoader.findClass()
14:11:16.536 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Runnable): interface java.lang.Runnable
14:11:16.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs)
14:11:16.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling childClassLoader.findClass()
14:11:16.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs)
14:11:16.537 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling childClassLoader().findClass() 
14:11:16.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling componentClassLoader.findClass()
14:11:16.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): calling componentClassLoader.loadClass()
14:11:16.546 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs): interface org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider$KeyACLs
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Object)
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Object): calling childClassLoader.findClass()
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Object): class java.lang.Object
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): class org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer): class org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer
14:11:16.547 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Map)
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Map): calling childClassLoader.findClass()
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Map): interface java.util.Map
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Throwable)
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Throwable): calling childClassLoader.findClass()
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Throwable): class java.lang.Throwable
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Exception)
14:11:16.548 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Exception): calling childClassLoader.findClass()
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Exception): class java.lang.Exception
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException)
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): calling childClassLoader.findClass()
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException)
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException): calling childClassLoader().findClass() 
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AuthorizationException): calling componentClassLoader.findClass()
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): calling componentClassLoader.loadClass()
14:11:16.549 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AuthorizationException): class org.apache.hadoop.security.authorize.AuthorizationException
14:11:16.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest)
14:11:16.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): calling childClassLoader.findClass()
14:11:16.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest)
14:11:16.550 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): calling childClassLoader().findClass() 
14:11:16.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): interface org.apache.ranger.plugin.policyengine.RangerAccessRequest
14:11:16.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest): interface org.apache.ranger.plugin.policyengine.RangerAccessRequest
14:11:16.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.UnknownHostException)
14:11:16.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.UnknownHostException): calling childClassLoader.findClass()
14:11:16.551 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.UnknownHostException): class java.net.UnknownHostException
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.IOException)
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.IOException): calling childClassLoader.findClass()
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.IOException): class java.io.IOException
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory)
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): calling childClassLoader.findClass()
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory)
14:11:16.552 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory): calling childClassLoader().findClass() 
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.LoggerFactory): calling componentClassLoader.findClass()
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): calling componentClassLoader.loadClass()
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.slf4j.LoggerFactory): class org.slf4j.LoggerFactory
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer)
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer): calling childClassLoader.findClass()
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer)
14:11:16.553 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer): calling childClassLoader().findClass() 
14:11:16.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracer): class org.apache.ranger.plugin.util.RangerPerfTracer
14:11:16.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracer): class org.apache.ranger.plugin.util.RangerPerfTracer
14:11:16.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.StringBuilder)
14:11:16.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.StringBuilder): calling childClassLoader.findClass()
14:11:16.554 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.StringBuilder): class java.lang.StringBuilder
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.HashMap)
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.HashMap): calling childClassLoader.findClass()
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.HashMap): class java.util.HashMap
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type)
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling childClassLoader.findClass()
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type)
14:11:16.555 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling childClassLoader().findClass() 
14:11:16.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling componentClassLoader.findClass()
14:11:16.556 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): calling componentClassLoader.loadClass()
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type): class org.apache.hadoop.crypto.key.kms.server.KMSACLsType$Type
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration)
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): calling childClassLoader.findClass()
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration)
14:11:16.557 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration): calling childClassLoader().findClass() 
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.conf.Configuration): calling componentClassLoader.findClass()
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): calling componentClassLoader.loadClass()
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.conf.Configuration): class org.apache.hadoop.conf.Configuration
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.slf4j.Logger)
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.Logger): calling childClassLoader.findClass()
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.slf4j.Logger)
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.Logger): calling childClassLoader().findClass() 
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.slf4j.Logger): calling componentClassLoader.findClass()
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.slf4j.Logger): calling componentClassLoader.loadClass()
14:11:16.558 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.slf4j.Logger): interface org.slf4j.Logger
14:11:16.559 [main] INFO org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- RangerKmsAuthorizer(conf)...
14:11:16.559 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- Loading ACLs file
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.System)
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.System): calling childClassLoader.findClass()
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.System): class java.lang.System
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration)
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling childClassLoader.findClass()
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration)
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling childClassLoader().findClass() 
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling componentClassLoader.findClass()
14:11:16.559 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): calling componentClassLoader.loadClass()
14:11:16.560 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.crypto.key.kms.server.KMSConfiguration): class org.apache.hadoop.crypto.key.kms.server.KMSConfiguration
14:11:16.568 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:11:16.569 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:11:16.570 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: keyadmin@KRBTEST.COM
14:11:16.571 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "keyadmin@KRBTEST.COM" with name: keyadmin@KRBTEST.COM
14:11:16.571 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "keyadmin@KRBTEST.COM"
14:11:16.571 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- UGI loginUser: keyadmin@KRBTEST.COM (auth:KERBEROS)
14:11:16.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-default.xml) 
14:11:16.578 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-default.xml): calling componentClassLoader.getResources()
14:11:16.579 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-default.xml): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar!/core-default.xml
14:11:16.584 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-site.xml) 
14:11:16.585 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-site.xml): calling componentClassLoader.getResources()
14:11:16.586 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-site.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/core-site.xml
14:11:16.587 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.InetAddress)
14:11:16.587 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.InetAddress): calling childClassLoader.findClass()
14:11:16.587 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.InetAddress): class java.net.InetAddress
14:11:16.585 [TGT Renewer for keyadmin@KRBTEST.COM] DEBUG org.apache.hadoop.security.UserGroupInformation -- Current time is 1777903876585, next refresh is 1777972978000
14:11:16.594 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin)
14:11:16.595 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin): calling childClassLoader.findClass()
14:11:16.595 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin)
14:11:16.595 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin): calling childClassLoader().findClass() 
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLogin): class org.apache.hadoop.security.SecureClientLogin
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLogin): class org.apache.hadoop.security.SecureClientLogin
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException)
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException): calling childClassLoader.findClass()
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.login.LoginException): class javax.security.auth.login.LoginException
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration)
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration): calling childClassLoader.findClass()
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.login.Configuration): class javax.security.auth.login.Configuration
14:11:16.596 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration)
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration): calling childClassLoader.findClass()
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration)
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration): calling childClassLoader().findClass() 
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.hadoop.security.SecureClientLoginConfiguration): class org.apache.hadoop.security.SecureClientLoginConfiguration
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.SecureClientLoginConfiguration): class org.apache.hadoop.security.SecureClientLoginConfiguration
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.Principal)
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.Principal): calling childClassLoader.findClass()
14:11:16.597 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.Principal): interface java.security.Principal
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Collection)
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Collection): calling childClassLoader.findClass()
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Collection): interface java.util.Collection
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Set)
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Set): calling childClassLoader.findClass()
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Set): interface java.util.Set
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.String)
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.String): calling childClassLoader.findClass()
14:11:16.598 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.String): class java.lang.String
14:11:16.599 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- Ranger KMS Principal : rangerkms/127.25.254.212@KRBTEST.COM, Keytab : /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab, NameRule : RULE:[2:$1@$0](rangeradmin@KRBTEST.COM)s/(.*)@KRBTEST.COM/ranger/
RULE:[2:$1@$0](rangertagsync@KRBTEST.COM)s/(.*)@KRBTEST.COM/rangertagsync/
RULE:[2:$1@$0](rangerusersync@KRBTEST.COM)s/(.*)@KRBTEST.COM/rangerusersync/
RULE:[2:$1@$0](rangerkms@KRBTEST.COM)s/(.*)@KRBTEST.COM/keyadmin/
RULE:[2:$1@$0](atlas@KRBTEST.COM)s/(.*)@KRBTEST.COM/atlas/
DEFAULT
14:11:16.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil)
14:11:16.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil): calling childClassLoader.findClass()
14:11:16.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil)
14:11:16.599 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil): calling childClassLoader().findClass() 
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil): class org.apache.ranger.audit.provider.MiscUtil
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil): class org.apache.ranger.audit.provider.MiscUtil
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ThreadLocal)
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ThreadLocal): calling childClassLoader.findClass()
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ThreadLocal): class java.lang.ThreadLocal
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1)
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1): calling childClassLoader.findClass()
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1)
14:11:16.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1): calling childClassLoader().findClass() 
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$1): class org.apache.ranger.audit.provider.MiscUtil$1
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$1): class org.apache.ranger.audit.provider.MiscUtil$1
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.List)
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.List): calling childClassLoader.findClass()
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.List): interface java.util.List
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.CharSequence)
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.CharSequence): calling childClassLoader.findClass()
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.CharSequence): interface java.lang.CharSequence
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NumberFormatException)
14:11:16.602 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NumberFormatException): calling childClassLoader.findClass()
14:11:16.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NumberFormatException): class java.lang.NumberFormatException
14:11:16.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration)
14:11:16.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): calling childClassLoader.findClass()
14:11:16.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration)
14:11:16.603 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): calling childClassLoader().findClass() 
14:11:16.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): class org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration
14:11:16.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration): class org.apache.ranger.audit.provider.MiscUtil$KerberosConfiguration
14:11:16.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID)
14:11:16.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID): calling childClassLoader.findClass()
14:11:16.605 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.rmi.dgc.VMID): class java.rmi.dgc.VMID
14:11:16.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.DateFormat)
14:11:16.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.DateFormat): calling childClassLoader.findClass()
14:11:16.606 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.DateFormat): class java.text.DateFormat
14:11:16.607 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat)
14:11:16.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat): calling childClassLoader.findClass()
14:11:16.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.SimpleDateFormat): class java.text.SimpleDateFormat
14:11:16.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Hashtable)
14:11:16.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Hashtable): calling childClassLoader.findClass()
14:11:16.608 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Hashtable): class java.util.Hashtable
14:11:16.608 [main] DEBUG org.apache.ranger.audit.provider.MiscUtil -- ==> MiscUtil.initLocalHost()
14:11:16.608 [main] DEBUG org.apache.ranger.audit.provider.MiscUtil -- <== MiscUtil.initLocalHost()
14:11:16.609 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.security.auth.Subject)
14:11:16.624 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.security.auth.Subject): calling childClassLoader.findClass()
14:11:16.624 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.security.auth.Subject): class javax.security.auth.Subject
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName)
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): calling childClassLoader.findClass()
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName)
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName): calling childClassLoader().findClass() 
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authentication.util.KerberosName): calling componentClassLoader.findClass()
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): calling componentClassLoader.loadClass()
14:11:16.625 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authentication.util.KerberosName): class org.apache.hadoop.security.authentication.util.KerberosName
14:11:16.626 [main] INFO org.apache.ranger.audit.provider.MiscUtil -- Creating UGI from keytab directly. keytab=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/rangerkms_127.25.254.212@KRBTEST.COM.keytab, principal=rangerkms/127.25.254.212@KRBTEST.COM
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation)
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): calling childClassLoader.findClass()
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation)
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation): calling childClassLoader().findClass() 
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.UserGroupInformation): calling componentClassLoader.findClass()
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): calling componentClassLoader.loadClass()
14:11:16.626 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.UserGroupInformation): class org.apache.hadoop.security.UserGroupInformation
May 04 14:11:16 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903876, etypes {rep=17 tkt=17 ses=17}, rangerkms/127.25.254.212@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
14:11:16.656 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Hadoop login
14:11:16.656 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- hadoop login commit
14:11:16.657 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using kerberos user: rangerkms/127.25.254.212@KRBTEST.COM
14:11:16.657 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- Using user: "rangerkms/127.25.254.212@KRBTEST.COM" with name: rangerkms/127.25.254.212@KRBTEST.COM
14:11:16.657 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- User entry: "rangerkms/127.25.254.212@KRBTEST.COM"
14:11:16.657 [main] INFO org.apache.ranger.audit.provider.MiscUtil -- Setting UGI=rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:11:16.657 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList)
14:11:16.657 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): calling childClassLoader.findClass()
14:11:16.658 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList)
14:11:16.658 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList): calling childClassLoader().findClass() 
14:11:16.658 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.authorize.AccessControlList): calling componentClassLoader.findClass()
14:11:16.658 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): calling componentClassLoader.loadClass()
14:11:16.659 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.authorize.AccessControlList): class org.apache.hadoop.security.authorize.AccessControlList
14:11:16.661 [main] INFO org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- 'DECRYPT_EEK' Blacklist 'hdfs'
14:11:16.662 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.init()
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin)
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): calling childClassLoader.findClass()
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin)
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): calling childClassLoader().findClass() 
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin)
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin): calling childClassLoader.findClass()
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin)
14:11:16.662 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin): calling childClassLoader().findClass() 
14:11:16.664 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerBasePlugin): class org.apache.ranger.plugin.service.RangerBasePlugin
14:11:16.664 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerBasePlugin): class org.apache.ranger.plugin.service.RangerBasePlugin
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): class org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin): class org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.RuntimeException)
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.RuntimeException): calling childClassLoader.findClass()
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.RuntimeException): class java.lang.RuntimeException
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig)
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): calling childClassLoader.findClass()
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig)
14:11:16.665 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): calling childClassLoader().findClass() 
14:11:16.666 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration)
14:11:16.666 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): calling childClassLoader.findClass()
14:11:16.666 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration)
14:11:16.666 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): calling childClassLoader().findClass() 
14:11:16.667 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): class org.apache.ranger.authorization.hadoop.config.RangerConfiguration
14:11:16.667 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerConfiguration): class org.apache.ranger.authorization.hadoop.config.RangerConfiguration
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): class org.apache.ranger.authorization.hadoop.config.RangerPluginConfig
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerPluginConfig): class org.apache.ranger.authorization.hadoop.config.RangerPluginConfig
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory)
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory): calling childClassLoader.findClass()
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory)
14:11:16.668 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory): calling childClassLoader().findClass() 
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory): class org.apache.ranger.audit.provider.AuditProviderFactory
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory): class org.apache.ranger.audit.provider.AuditProviderFactory
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory)
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): calling childClassLoader.findClass()
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory)
14:11:16.669 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): calling childClassLoader().findClass() 
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): class org.apache.ranger.audit.provider.StandAloneAuditProviderFactory
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.StandAloneAuditProviderFactory): class org.apache.ranger.audit.provider.StandAloneAuditProviderFactory
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine)
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): calling childClassLoader.findClass()
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine)
14:11:16.670 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): calling childClassLoader().findClass() 
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): interface org.apache.ranger.plugin.policyengine.RangerPolicyEngine
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngine): interface org.apache.ranger.plugin.policyengine.RangerPolicyEngine
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient)
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient): calling childClassLoader.findClass()
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient)
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient): calling childClassLoader().findClass() 
14:11:16.671 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminClient): interface org.apache.ranger.admin.client.RangerAdminClient
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminClient): interface org.apache.ranger.admin.client.RangerAdminClient
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.InterruptedException)
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.InterruptedException): calling childClassLoader.findClass()
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.InterruptedException): class java.lang.InterruptedException
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource)
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): calling childClassLoader.findClass()
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource)
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): calling childClassLoader().findClass() 
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): interface org.apache.ranger.plugin.policyengine.RangerAccessResource
14:11:16.672 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResource): interface org.apache.ranger.plugin.policyengine.RangerAccessResource
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor)
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): calling childClassLoader.findClass()
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor)
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): calling childClassLoader().findClass() 
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor
14:11:16.673 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessResultProcessor
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.MalformedURLException)
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.MalformedURLException): calling childClassLoader.findClass()
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.MalformedURLException): class java.net.MalformedURLException
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Collections)
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Collections): calling childClassLoader.findClass()
14:11:16.674 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Collections): class java.util.Collections
14:11:16.675 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-audit.xml)
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils)
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): calling childClassLoader.findClass()
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils)
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils): calling childClassLoader().findClass() 
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.lang.StringUtils): calling componentClassLoader.findClass()
14:11:16.675 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): calling componentClassLoader.loadClass()
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.lang.StringUtils): class org.apache.commons.lang.StringUtils
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Class)
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Class): calling childClassLoader.findClass()
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Class): class java.lang.Class
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassLoader)
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassLoader): calling childClassLoader.findClass()
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassLoader): class java.lang.ClassLoader
14:11:16.676 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-audit.xml) 
14:11:16.677 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-audit.xml): calling componentClassLoader.getResources()
14:11:16.677 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-audit.xml): null
14:11:16.677 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-audit.xml) 
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-audit.xml): calling componentClassLoader.getResources()
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-audit.xml): null
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.File)
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.File): calling childClassLoader.findClass()
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.File): class java.io.File
14:11:16.678 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-audit.xml does not exists
14:11:16.678 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-audit.xml): couldn't find resource file location
14:11:16.678 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-audit.xml), result=false
14:11:16.678 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ==> addAuditResource(Service Type: kms
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder)
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): calling childClassLoader.findClass()
14:11:16.678 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder)
14:11:16.679 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): calling childClassLoader().findClass() 
14:11:16.679 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): class org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder
14:11:16.679 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder): class org.apache.ranger.authorization.hadoop.config.RangerLegacyConfigBuilder
14:11:16.680 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hive-site.xml) 
14:11:16.680 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hive-site.xml): calling componentClassLoader.getResources()
14:11:16.680 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hive-site.xml): null
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hive-site.xml) 
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hive-site.xml): calling componentClassLoader.getResources()
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hive-site.xml): null
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hbase-site.xml) 
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hbase-site.xml): calling componentClassLoader.getResources()
14:11:16.681 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hbase-site.xml): null
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hbase-site.xml) 
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hbase-site.xml): calling componentClassLoader.getResources()
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hbase-site.xml): null
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(hdfs-site.xml) 
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(hdfs-site.xml): calling componentClassLoader.getResources()
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(hdfs-site.xml): null
14:11:16.682 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/hdfs-site.xml) 
14:11:16.683 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/hdfs-site.xml): calling componentClassLoader.getResources()
14:11:16.683 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/hdfs-site.xml): null
14:11:16.683 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- <== addAuditResource(Service Type: kms)
14:11:16.683 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-security.xml)
14:11:16.683 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-security.xml) 
14:11:16.683 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-security.xml): calling componentClassLoader.getResources()
14:11:16.683 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-security.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-security.xml
14:11:16.683 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-security.xml): resource file is file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-security.xml
14:11:16.684 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-security.xml), result=true
14:11:16.684 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-policymgr-ssl.xml)
14:11:16.684 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml) 
14:11:16.684 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:11:16.684 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-policymgr-ssl.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:11:16.684 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-policymgr-ssl.xml): resource file is file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:11:16.684 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-policymgr-ssl.xml), result=true
14:11:16.686 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-audit.xml)
14:11:16.686 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml) 
14:11:16.686 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml): calling componentClassLoader.getResources()
14:11:16.686 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-audit.xml): null
14:11:16.686 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml) 
14:11:16.687 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml): calling componentClassLoader.getResources()
14:11:16.687 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-audit.xml): null
14:11:16.687 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-audit.xml does not exists
14:11:16.687 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-audit.xml): couldn't find resource file location
14:11:16.687 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-audit.xml), result=false
14:11:16.687 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-security.xml)
14:11:16.687 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml) 
14:11:16.687 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml): calling componentClassLoader.getResources()
14:11:16.687 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-security.xml): null
14:11:16.688 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml) 
14:11:16.688 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml): calling componentClassLoader.getResources()
14:11:16.688 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-security.xml): null
14:11:16.688 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-security.xml does not exists
14:11:16.688 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-security.xml): couldn't find resource file location
14:11:16.688 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-security.xml), result=false
14:11:16.688 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- ==> addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml)
14:11:16.688 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml) 
14:11:16.688 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(ranger-kms-kms-policymgr-ssl.xml): null
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml) 
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml): calling componentClassLoader.getResources()
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(/ranger-kms-kms-policymgr-ssl.xml): null
14:11:16.689 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- Conf file path ranger-kms-kms-policymgr-ssl.xml does not exists
14:11:16.689 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml): couldn't find resource file location
14:11:16.689 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerConfiguration -- <== addResourceIfReadable(ranger-kms-kms-policymgr-ssl.xml), result=false
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil)
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil): calling childClassLoader.findClass()
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil)
14:11:16.689 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil): calling childClassLoader().findClass() 
14:11:16.690 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.StringUtil): class org.apache.ranger.authorization.utils.StringUtil
14:11:16.690 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.StringUtil): class org.apache.ranger.authorization.utils.StringUtil
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.OutputStream)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.OutputStream): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.OutputStream): class java.io.OutputStream
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.ByteArrayOutputStream): class java.io.ByteArrayOutputStream
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.InputStream)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.InputStream): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.InputStream): class java.io.InputStream
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.ByteArrayInputStream): class java.io.ByteArrayInputStream
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TimeZone)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TimeZone): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TimeZone): class java.util.TimeZone
14:11:16.691 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ranger.plugin.kms.use.x-forwarded-for.ipaddress:false
14:11:16.691 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- ranger.plugin.kms.trusted.proxy.ipaddresses:[null]
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions)
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): calling childClassLoader.findClass()
14:11:16.691 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions)
14:11:16.692 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): calling childClassLoader().findClass() 
14:11:16.692 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions
14:11:16.692 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineOptions
14:11:16.693 [main] INFO org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- PolicyEngineOptions: { evaluatorType: auto, evaluateDelegateAdminOnly: false, disableContextEnrichers: false, disableCustomConditions: false, disableTagPolicyEvaluation: false, disablePolicyRefresher: false, disableTagRetriever: false, disableUserStoreRetriever: false, enableTagEnricherWithLocalRefresher: false, enableUserStoreEnricherWithLocalRefresher: false, disableTrieLookupPrefilter: false, optimizeTrieForRetrieval: false, cacheAuditResult: false, disableRoleResolution: true, optimizeTrieForSpace: false, optimizeTagTrieForRetrieval: false, optimizeTagTrieForSpace: false, enableResourceMatcherReuse: true }
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger)
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger): calling childClassLoader.findClass()
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger)
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger): calling childClassLoader().findClass() 
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloadTrigger): class org.apache.ranger.plugin.util.DownloadTrigger
14:11:16.693 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloadTrigger): class org.apache.ranger.plugin.util.DownloadTrigger
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext)
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): calling childClassLoader.findClass()
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext)
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): calling childClassLoader().findClass() 
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): class org.apache.ranger.plugin.policyengine.RangerPluginContext
14:11:16.694 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPluginContext): class org.apache.ranger.plugin.policyengine.RangerPluginContext
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock)
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock): calling childClassLoader.findClass()
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock): class java.util.concurrent.locks.ReentrantReadWriteLock
14:11:16.695 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- ranger.plugin.kms.null_safe.supplier=v2
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject)
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject): calling childClassLoader.findClass()
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject)
14:11:16.695 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject): calling childClassLoader().findClass() 
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Serializable)
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Serializable): calling childClassLoader.findClass()
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Serializable): interface java.io.Serializable
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject): class org.apache.ranger.plugin.model.RangerBaseModelObject
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject): class org.apache.ranger.plugin.model.RangerBaseModelObject
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier)
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): calling childClassLoader.findClass()
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier)
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): calling childClassLoader().findClass() 
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplier
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1)
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): calling childClassLoader.findClass()
14:11:16.696 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1)
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): calling childClassLoader().findClass() 
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV1
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2)
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): calling childClassLoader.findClass()
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2)
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): calling childClassLoader().findClass() 
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2): class org.apache.ranger.plugin.model.RangerBaseModelObject$NullSafeSupplierV2
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder)
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder): calling childClassLoader.findClass()
14:11:16.697 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder)
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder): calling childClassLoader().findClass() 
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder): class org.apache.ranger.plugin.util.PerfDataRecorder
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder): class org.apache.ranger.plugin.util.PerfDataRecorder
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Thread)
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Thread): calling childClassLoader.findClass()
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Thread): class java.lang.Thread
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper)
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): calling childClassLoader.findClass()
14:11:16.698 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper)
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): calling childClassLoader().findClass() 
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): class org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper): class org.apache.ranger.plugin.util.PerfDataRecorder$StatisticsDumper
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils)
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils): calling childClassLoader.findClass()
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils)
14:11:16.699 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils): calling childClassLoader().findClass() 
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.CollectionUtils): class org.apache.commons.collections.CollectionUtils
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.CollectionUtils): class org.apache.commons.collections.CollectionUtils
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException)
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException): calling childClassLoader.findClass()
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalArgumentException): class java.lang.IllegalArgumentException
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NullPointerException)
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NullPointerException): calling childClassLoader.findClass()
14:11:16.700 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NullPointerException): class java.lang.NullPointerException
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException)
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException): calling childClassLoader.findClass()
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IndexOutOfBoundsException): class java.lang.IndexOutOfBoundsException
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Integer)
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Integer): calling childClassLoader.findClass()
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Integer): class java.lang.Integer
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.ArrayList)
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.ArrayList): calling childClassLoader.findClass()
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.ArrayList): class java.util.ArrayList
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection)
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection): calling childClassLoader.findClass()
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection)
14:11:16.701 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection): calling childClassLoader().findClass() 
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable): calling childClassLoader.findClass()
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable): calling childClassLoader().findClass() 
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.Unmodifiable): interface org.apache.commons.collections.Unmodifiable
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.Unmodifiable): interface org.apache.commons.collections.Unmodifiable
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): calling childClassLoader.findClass()
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): calling childClassLoader().findClass() 
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): calling childClassLoader.findClass()
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator)
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): calling childClassLoader().findClass() 
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): class org.apache.commons.collections.collection.AbstractCollectionDecorator
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractCollectionDecorator): class org.apache.commons.collections.collection.AbstractCollectionDecorator
14:11:16.702 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): class org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator): class org.apache.commons.collections.collection.AbstractSerializableCollectionDecorator
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.collection.UnmodifiableCollection): class org.apache.commons.collections.collection.UnmodifiableCollection
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.collection.UnmodifiableCollection): class org.apache.commons.collections.collection.UnmodifiableCollection
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException)
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException): calling childClassLoader.findClass()
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.UnsupportedOperationException): class java.lang.UnsupportedOperationException
14:11:16.703 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- superUsers=[], superGroups=[]
14:11:16.703 [main] DEBUG org.apache.ranger.authorization.hadoop.config.RangerPluginConfig -- auditExcludedUsers=[], auditExcludedGroups=[], auditExcludedRoles=[]
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator)
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): calling childClassLoader.findClass()
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator)
14:11:16.703 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): calling childClassLoader().findClass() 
14:11:16.704 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.script.ScriptException)
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.script.ScriptException): calling childClassLoader.findClass()
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.script.ScriptException)
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.script.ScriptException): calling childClassLoader().findClass() 
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.script.ScriptException): calling componentClassLoader.findClass()
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.script.ScriptException): calling componentClassLoader.loadClass()
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.script.ScriptException): class javax.script.ScriptException
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1)
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): calling childClassLoader.findClass()
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1)
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): calling childClassLoader().findClass() 
14:11:16.705 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$1
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Comparator)
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Comparator): calling childClassLoader.findClass()
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Comparator): interface java.util.Comparator
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.text.ParseException)
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.text.ParseException): calling childClassLoader.findClass()
14:11:16.706 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.text.ParseException): class java.text.ParseException
14:11:16.710 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.regex.Pattern)
14:11:16.710 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.regex.Pattern): calling childClassLoader.findClass()
14:11:16.710 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.regex.Pattern): class java.util.regex.Pattern
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor)
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor): calling childClassLoader.findClass()
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor)
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor): calling childClassLoader().findClass() 
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.MacroProcessor): class org.apache.ranger.plugin.util.MacroProcessor
14:11:16.711 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.MacroProcessor): class org.apache.ranger.plugin.util.MacroProcessor
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Iterator)
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Iterator): calling childClassLoader.findClass()
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Iterator): interface java.util.Iterator
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2)
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): calling childClassLoader.findClass()
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2)
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): calling childClassLoader().findClass() 
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2
14:11:16.712 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2): class org.apache.ranger.plugin.policyengine.RangerRequestScriptEvaluator$2
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Arrays)
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Arrays): calling childClassLoader.findClass()
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Arrays): class java.util.Arrays
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler)
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler): calling childClassLoader.findClass()
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler)
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler): calling childClassLoader().findClass() 
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditHandler): interface org.apache.ranger.audit.provider.AuditHandler
14:11:16.713 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditHandler): interface org.apache.ranger.audit.provider.AuditHandler
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AuditProviderFactory: creating..
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider)
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider): calling childClassLoader.findClass()
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider)
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider): calling childClassLoader().findClass() 
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.DummyAuditProvider): class org.apache.ranger.audit.provider.DummyAuditProvider
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.DummyAuditProvider): class org.apache.ranger.audit.provider.DummyAuditProvider
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AuditProviderFactory: initializing..
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Properties)
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Properties): calling childClassLoader.findClass()
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Properties): class java.util.Properties
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.url=http://127.25.254.212:40687
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.keystore=/etc/ranger/kms/conf/ranger-plugin-keystore.jks
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.client.connection.timeoutMs=120000
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.client.read.timeoutMs=30000
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.truststore=/etc/ranger/kms/conf/ranger-plugin-truststore.jks
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.pollIntervalMs=30000
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.service.name=kms
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.truststore.credential.file=jceks://file/etc/ranger/kmsdev/cred.jceks
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.cache.dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/policycache
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.source.impl=org.apache.ranger.admin.client.RangerAdminRESTClient
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: xasecure.policymgr.clientssl.keystore.credential.file=jceks://file/etc/ranger/kmsdev/cred.jceks
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- AUDIT PROPERTY: ranger.plugin.kms.policy.rest.ssl.config.file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml
14:11:16.714 [main] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- No v3 audit configuration found. Trying v2 audit configurations
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook)
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): calling childClassLoader.findClass()
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook)
14:11:16.714 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): calling childClassLoader().findClass() 
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): class org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook): class org.apache.ranger.audit.provider.AuditProviderFactory$JVMShutdownHook
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore)
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore): calling childClassLoader.findClass()
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.Semaphore): class java.util.concurrent.Semaphore
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean)
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean): calling childClassLoader.findClass()
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicBoolean): class java.util.concurrent.atomic.AtomicBoolean
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup)
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): calling childClassLoader.findClass()
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup)
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): calling childClassLoader().findClass() 
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): class org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup
14:11:16.715 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup): class org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager)
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): calling childClassLoader.findClass()
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager)
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager): calling childClassLoader().findClass() 
14:11:16.716 [Ranger async Audit cleanup] INFO org.apache.ranger.audit.provider.AuditProviderFactory -- RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.util.ShutdownHookManager): calling componentClassLoader.findClass()
14:11:16.716 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): calling componentClassLoader.loadClass()
14:11:16.717 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.util.ShutdownHookManager): class org.apache.hadoop.util.ShutdownHookManager
14:11:16.722 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-default.xml) 
14:11:16.722 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-default.xml): calling componentClassLoader.getResources()
14:11:16.722 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-default.xml): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar!/core-default.xml
14:11:16.728 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(core-site.xml) 
14:11:16.728 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(core-site.xml): calling componentClassLoader.getResources()
14:11:16.729 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(core-site.xml): file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/core-site.xml
14:11:16.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher)
14:11:16.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher): calling childClassLoader.findClass()
14:11:16.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher)
14:11:16.735 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher): calling childClassLoader().findClass() 
14:11:16.736 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.PolicyRefresher): class org.apache.ranger.plugin.util.PolicyRefresher
14:11:16.736 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.PolicyRefresher): class org.apache.ranger.plugin.util.PolicyRefresher
14:11:16.736 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue)
14:11:16.736 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue): calling childClassLoader.findClass()
14:11:16.736 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.BlockingQueue): interface java.util.concurrent.BlockingQueue
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Reader)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Reader): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Reader): class java.io.Reader
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileReader)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileReader): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileReader): class java.io.FileReader
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalStateException)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalStateException): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalStateException): class java.lang.IllegalStateException
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TimerTask)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TimerTask): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TimerTask): class java.util.TimerTask
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask): calling childClassLoader().findClass() 
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.DownloaderTask): class org.apache.ranger.plugin.util.DownloaderTask
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.DownloaderTask): class org.apache.ranger.plugin.util.DownloaderTask
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.SecurityException)
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.SecurityException): calling childClassLoader.findClass()
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.SecurityException): class java.lang.SecurityException
14:11:16.737 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Writer)
14:11:16.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Writer): calling childClassLoader.findClass()
14:11:16.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Writer): class java.io.Writer
14:11:16.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileWriter)
14:11:16.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileWriter): calling childClassLoader.findClass()
14:11:16.738 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileWriter): class java.io.FileWriter
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException)
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): calling childClassLoader.findClass()
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException)
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): calling childClassLoader().findClass() 
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): class org.apache.ranger.plugin.util.RangerServiceNotFoundException
14:11:16.739 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerServiceNotFoundException): class org.apache.ranger.plugin.util.RangerServiceNotFoundException
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue)
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue): calling childClassLoader.findClass()
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.LinkedBlockingQueue): class java.util.concurrent.LinkedBlockingQueue
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).PolicyRefresher()
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> RangerBasePlugin.createAdminClient(kms, kms, ranger.plugin.kms)
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- Value for property[ranger.plugin.kms.policy.source.impl] was [org.apache.ranger.admin.client.RangerAdminRESTClient].
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient)
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient): calling childClassLoader.findClass()
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient)
14:11:16.740 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient): calling childClassLoader().findClass() 
14:11:16.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient)
14:11:16.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): calling childClassLoader.findClass()
14:11:16.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient)
14:11:16.741 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): calling childClassLoader().findClass() 
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): class org.apache.ranger.admin.client.AbstractRangerAdminClient
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.AbstractRangerAdminClient): class org.apache.ranger.admin.client.AbstractRangerAdminClient
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient): class org.apache.ranger.admin.client.RangerAdminRESTClient
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient): class org.apache.ranger.admin.client.RangerAdminRESTClient
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference)
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): calling childClassLoader.findClass()
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference)
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference): calling childClassLoader().findClass() 
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.type.TypeReference): calling componentClassLoader.findClass()
14:11:16.742 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): calling componentClassLoader.loadClass()
14:11:16.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.type.TypeReference): class com.fasterxml.jackson.core.type.TypeReference
14:11:16.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1)
14:11:16.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): calling childClassLoader.findClass()
14:11:16.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1)
14:11:16.743 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): calling childClassLoader().findClass() 
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): class org.apache.ranger.admin.client.RangerAdminRESTClient$1
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.admin.client.RangerAdminRESTClient$1): class org.apache.ranger.admin.client.RangerAdminRESTClient$1
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException)
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException): calling childClassLoader.findClass()
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.UnsupportedEncodingException): class java.io.UnsupportedEncodingException
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException)
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): calling childClassLoader.findClass()
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException)
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException): calling childClassLoader().findClass() 
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.security.AccessControlException): calling componentClassLoader.findClass()
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): calling componentClassLoader.loadClass()
14:11:16.744 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.security.AccessControlException): class org.apache.hadoop.security.AccessControlException
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie)
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): calling childClassLoader.findClass()
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie)
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie): calling childClassLoader().findClass() 
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.Cookie): calling componentClassLoader.findClass()
14:11:16.745 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): calling componentClassLoader.loadClass()
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.ws.rs.core.Cookie): class javax.ws.rs.core.Cookie
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie)
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): calling childClassLoader.findClass()
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie)
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie): calling childClassLoader().findClass() 
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.ws.rs.core.NewCookie): calling componentClassLoader.findClass()
14:11:16.746 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): calling componentClassLoader.loadClass()
14:11:16.747 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.ws.rs.core.NewCookie): class javax.ws.rs.core.NewCookie
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils)
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils): calling childClassLoader.findClass()
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils)
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils): calling childClassLoader().findClass() 
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTUtils): class org.apache.ranger.plugin.util.RangerRESTUtils
14:11:16.748 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTUtils): class org.apache.ranger.plugin.util.RangerRESTUtils
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname)
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname): calling childClassLoader.findClass()
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname)
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname): calling childClassLoader().findClass() 
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname): class com.kstruct.gethostname4j.Hostname
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname): class com.kstruct.gethostname4j.Hostname
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Platform)
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Platform): calling childClassLoader.findClass()
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Platform)
14:11:16.749 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Platform): calling childClassLoader().findClass() 
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Platform): class com.sun.jna.Platform
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Platform): class com.sun.jna.Platform
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException): calling childClassLoader.findClass()
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassNotFoundException): class java.lang.ClassNotFoundException
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.Buffer)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.Buffer): calling childClassLoader.findClass()
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.Buffer): class java.nio.Buffer
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): calling childClassLoader.findClass()
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): calling childClassLoader().findClass() 
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library): calling childClassLoader.findClass()
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library)
14:11:16.750 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library): calling childClassLoader().findClass() 
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library): interface com.sun.jna.Library
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library): interface com.sun.jna.Library
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): interface com.kstruct.gethostname4j.Hostname$UnixCLibrary
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.kstruct.gethostname4j.Hostname$UnixCLibrary): interface com.kstruct.gethostname4j.Hostname$UnixCLibrary
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native)
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native): calling childClassLoader.findClass()
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native)
14:11:16.751 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native): calling childClassLoader().findClass() 
14:11:16.752 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Version)
14:11:16.752 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Version): calling childClassLoader.findClass()
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Version)
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Version): calling childClassLoader().findClass() 
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Version): interface com.sun.jna.Version
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Version): interface com.sun.jna.Version
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native): class com.sun.jna.Native
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native): class com.sun.jna.Native
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler)
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler): calling childClassLoader.findClass()
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler)
14:11:16.753 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler): calling childClassLoader().findClass() 
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Callback$UncaughtExceptionHandler): interface com.sun.jna.Callback$UncaughtExceptionHandler
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Callback$UncaughtExceptionHandler): interface com.sun.jna.Callback$UncaughtExceptionHandler
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Error)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Error): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Error): class java.lang.Error
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$7)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$7): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$7)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$7): calling childClassLoader().findClass() 
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$7): class com.sun.jna.Native$7
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$7): class com.sun.jna.Native$7
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationHandler): interface java.lang.reflect.InvocationHandler
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodError): class java.lang.NoSuchMethodError
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.UnsatisfiedLinkError): class java.lang.UnsatisfiedLinkError
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException)
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException): calling childClassLoader.findClass()
14:11:16.754 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.IllegalCharsetNameException): class java.nio.charset.IllegalCharsetNameException
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.UnsupportedCharsetException): class java.nio.charset.UnsupportedCharsetException
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldException): class java.lang.NoSuchFieldException
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URISyntaxException)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URISyntaxException): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URISyntaxException): class java.net.URISyntaxException
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.PrivilegedAction)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.PrivilegedAction): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.PrivilegedAction): interface java.security.PrivilegedAction
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FilenameFilter)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FilenameFilter): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FilenameFilter): interface java.io.FilenameFilter
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext): calling childClassLoader.findClass()
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext)
14:11:16.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext): calling childClassLoader().findClass() 
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FromNativeContext): class com.sun.jna.FromNativeContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeContext): class com.sun.jna.FromNativeContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext)
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext): calling childClassLoader.findClass()
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext)
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext): calling childClassLoader().findClass() 
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext)
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext): calling childClassLoader.findClass()
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext)
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext): calling childClassLoader().findClass() 
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FunctionResultContext): class com.sun.jna.FunctionResultContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FunctionResultContext): class com.sun.jna.FunctionResultContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.MethodResultContext): class com.sun.jna.MethodResultContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.MethodResultContext): class com.sun.jna.MethodResultContext
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.logging.Logger)
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.logging.Logger): calling childClassLoader.findClass()
14:11:16.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.logging.Logger): class java.util.logging.Logger
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.charset.Charset)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.charset.Charset): calling childClassLoader.findClass()
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.charset.Charset): class java.nio.charset.Charset
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Boolean)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Boolean): calling childClassLoader.findClass()
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Boolean): class java.lang.Boolean
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.logging.Level)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.logging.Level): calling childClassLoader.findClass()
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.logging.Level): class java.util.logging.Level
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.WeakHashMap)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.WeakHashMap): calling childClassLoader.findClass()
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.WeakHashMap): class java.util.WeakHashMap
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$1)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$1): calling childClassLoader.findClass()
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$1)
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$1): calling childClassLoader().findClass() 
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$1): class com.sun.jna.Native$1
14:11:16.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$1): class com.sun.jna.Native$1
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$5)
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$5): calling childClassLoader.findClass()
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$5)
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$5): calling childClassLoader().findClass() 
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$5): class com.sun.jna.Native$5
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$5): class com.sun.jna.Native$5
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so) 
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar!/com/sun/jna/linux-x86-64/libjnidispatch.so
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URL)
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URL): calling childClassLoader.findClass()
14:11:16.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URL): class java.net.URL
14:11:16.759 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so) 
14:11:16.759 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(com/sun/jna/linux-x86-64/libjnidispatch.so): jar:file:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/ranger-kms-plugin-impl/jna-5.7.0.jar!/com/sun/jna/linux-x86-64/libjnidispatch.so
14:11:16.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileOutputStream)
14:11:16.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileOutputStream): calling childClassLoader.findClass()
14:11:16.760 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileOutputStream): class java.io.FileOutputStream
14:11:16.762 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Method)
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Method): calling childClassLoader.findClass()
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Method): class java.lang.reflect.Method
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.ByteBuffer)
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.ByteBuffer): calling childClassLoader.findClass()
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.ByteBuffer): class java.nio.ByteBuffer
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.CharBuffer)
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.CharBuffer): calling childClassLoader.findClass()
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.CharBuffer): class java.nio.CharBuffer
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.ShortBuffer)
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.ShortBuffer): calling childClassLoader.findClass()
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.ShortBuffer): class java.nio.ShortBuffer
14:11:16.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.IntBuffer)
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.IntBuffer): calling childClassLoader.findClass()
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.IntBuffer): class java.nio.IntBuffer
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.LongBuffer)
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.LongBuffer): calling childClassLoader.findClass()
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.LongBuffer): class java.nio.LongBuffer
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.FloatBuffer)
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.FloatBuffer): calling childClassLoader.findClass()
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.FloatBuffer): class java.nio.FloatBuffer
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer)
14:11:16.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.nio.DoubleBuffer): class java.nio.DoubleBuffer
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Void)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Void): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Void): class java.lang.Void
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Byte)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Byte): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Byte): class java.lang.Byte
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Character)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Character): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Character): class java.lang.Character
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Short)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Short): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Short): class java.lang.Short
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Long)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Long): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Long): class java.lang.Long
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Float)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Float): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Float): class java.lang.Float
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Double)
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Double): calling childClassLoader.findClass()
14:11:16.765 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Double): class java.lang.Double
14:11:16.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Pointer)
14:11:16.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Pointer): calling childClassLoader.findClass()
14:11:16.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Pointer)
14:11:16.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Pointer): calling childClassLoader().findClass() 
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Pointer): class com.sun.jna.Pointer
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Pointer): class com.sun.jna.Pointer
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.StringWriter)
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.StringWriter): calling childClassLoader.findClass()
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.StringWriter): class java.io.StringWriter
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque)
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque): calling childClassLoader.findClass()
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque)
14:11:16.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque): calling childClassLoader().findClass() 
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Pointer$Opaque): class com.sun.jna.Pointer$Opaque
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Pointer$Opaque): class com.sun.jna.Pointer$Opaque
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure)
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure): calling childClassLoader.findClass()
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure)
14:11:16.768 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure): calling childClassLoader().findClass() 
14:11:16.769 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure): class com.sun.jna.Structure
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure): class com.sun.jna.Structure
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1): calling childClassLoader.findClass()
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$1)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$1): calling childClassLoader().findClass() 
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$1): class com.sun.jna.Structure$1
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$1): class com.sun.jna.Structure$1
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2): calling childClassLoader.findClass()
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$2)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$2): calling childClassLoader().findClass() 
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$2): class com.sun.jna.Structure$2
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$2): class com.sun.jna.Structure$2
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3): calling childClassLoader.findClass()
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$3)
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$3): calling childClassLoader().findClass() 
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$3): class com.sun.jna.Structure$3
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$3): class com.sun.jna.Structure$3
14:11:16.770 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Memory)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Memory): calling childClassLoader.findClass()
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Memory)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Memory): calling childClassLoader().findClass() 
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Memory): class com.sun.jna.Memory
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Memory): class com.sun.jna.Memory
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.InstantiationException)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.InstantiationException): calling childClassLoader.findClass()
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.InstantiationException): class java.lang.InstantiationException
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException): calling childClassLoader.findClass()
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.IllegalAccessException): class java.lang.IllegalAccessException
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException): calling childClassLoader.findClass()
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.InvocationTargetException): class java.lang.reflect.InvocationTargetException
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException)
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException): calling childClassLoader.findClass()
14:11:16.771 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchMethodException): class java.lang.NoSuchMethodException
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated): calling childClassLoader.findClass()
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated): calling childClassLoader().findClass() 
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$AutoAllocated): class com.sun.jna.Structure$AutoAllocated
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$AutoAllocated): class com.sun.jna.Structure$AutoAllocated
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext): calling childClassLoader.findClass()
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext): calling childClassLoader().findClass() 
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.ToNativeContext): class com.sun.jna.ToNativeContext
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeContext): class com.sun.jna.ToNativeContext
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext): calling childClassLoader.findClass()
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext): calling childClassLoader().findClass() 
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.StructureWriteContext): class com.sun.jna.StructureWriteContext
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.StructureWriteContext): class com.sun.jna.StructureWriteContext
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter): calling childClassLoader.findClass()
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter)
14:11:16.772 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter): calling childClassLoader().findClass() 
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.ToNativeConverter): interface com.sun.jna.ToNativeConverter
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.ToNativeConverter): interface com.sun.jna.ToNativeConverter
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter)
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter): calling childClassLoader.findClass()
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter)
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter): calling childClassLoader().findClass() 
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FromNativeConverter): interface com.sun.jna.FromNativeConverter
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FromNativeConverter): interface com.sun.jna.FromNativeConverter
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext)
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext): calling childClassLoader.findClass()
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext)
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext): calling childClassLoader().findClass() 
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.StructureReadContext): class com.sun.jna.StructureReadContext
14:11:16.773 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.StructureReadContext): class com.sun.jna.StructureReadContext
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue): calling childClassLoader.findClass()
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue): calling childClassLoader().findClass() 
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$ByValue): interface com.sun.jna.Structure$ByValue
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$ByValue): interface com.sun.jna.Structure$ByValue
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Callback)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Callback): calling childClassLoader.findClass()
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Callback)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Callback): calling childClassLoader().findClass() 
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Callback): interface com.sun.jna.Callback
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Callback): interface com.sun.jna.Callback
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions): calling childClassLoader.findClass()
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions)
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions): calling childClassLoader().findClass() 
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference$AttachOptions): class com.sun.jna.CallbackReference$AttachOptions
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference$AttachOptions): class com.sun.jna.CallbackReference$AttachOptions
14:11:16.774 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference)
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference): calling childClassLoader.findClass()
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference)
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference): calling childClassLoader().findClass() 
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference)
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference): calling childClassLoader.findClass()
14:11:16.775 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ref.WeakReference): class java.lang.ref.WeakReference
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackReference): class com.sun.jna.CallbackReference
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackReference): class com.sun.jna.CallbackReference
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy)
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy): calling childClassLoader.findClass()
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy)
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy): calling childClassLoader().findClass() 
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.CallbackProxy): interface com.sun.jna.CallbackProxy
14:11:16.776 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.CallbackProxy): interface com.sun.jna.CallbackProxy
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ref.Reference)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ref.Reference): calling childClassLoader.findClass()
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ref.Reference): class java.lang.ref.Reference
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.WString)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.WString): calling childClassLoader.findClass()
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.WString)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.WString): calling childClassLoader().findClass() 
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Comparable)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Comparable): calling childClassLoader.findClass()
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Comparable): interface java.lang.Comparable
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.WString): class com.sun.jna.WString
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.WString): class com.sun.jna.WString
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped): calling childClassLoader.findClass()
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped)
14:11:16.777 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped): calling childClassLoader().findClass() 
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.NativeMapped): interface com.sun.jna.NativeMapped
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.NativeMapped): interface com.sun.jna.NativeMapped
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType)
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType): calling childClassLoader.findClass()
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.IntegerType)
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.IntegerType): calling childClassLoader().findClass() 
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Number)
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Number): calling childClassLoader.findClass()
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Number): class java.lang.Number
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.IntegerType): class com.sun.jna.IntegerType
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.IntegerType): class com.sun.jna.IntegerType
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.PointerType)
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.PointerType): calling childClassLoader.findClass()
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.PointerType)
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.PointerType): calling childClassLoader().findClass() 
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.PointerType): class com.sun.jna.PointerType
14:11:16.778 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.PointerType): class com.sun.jna.PointerType
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv): calling childClassLoader.findClass()
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv): calling childClassLoader().findClass() 
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.JNIEnv): class com.sun.jna.JNIEnv
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.JNIEnv): class com.sun.jna.JNIEnv
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback): calling childClassLoader.findClass()
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback): calling childClassLoader().findClass() 
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$ffi_callback): interface com.sun.jna.Native$ffi_callback
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$ffi_callback): interface com.sun.jna.Native$ffi_callback
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes): calling childClassLoader.findClass()
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes)
14:11:16.779 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes): calling childClassLoader().findClass() 
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Structure$FFIType$FFITypes): class com.sun.jna.Structure$FFIType$FFITypes
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Structure$FFIType$FFITypes): class com.sun.jna.Structure$FFIType$FFITypes
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Native$2)
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Native$2): calling childClassLoader.findClass()
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Native$2)
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Native$2): calling childClassLoader().findClass() 
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Native$2): class com.sun.jna.Native$2
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Native$2): class com.sun.jna.Native$2
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler)
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler): calling childClassLoader.findClass()
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler)
14:11:16.780 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler): calling childClassLoader().findClass() 
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler): class com.sun.jna.Library$Handler
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler): class com.sun.jna.Library$Handler
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention)
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention): calling childClassLoader.findClass()
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention)
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention): calling childClassLoader().findClass() 
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.AltCallingConvention): interface com.sun.jna.AltCallingConvention
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.AltCallingConvention): interface com.sun.jna.AltCallingConvention
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary)
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary): calling childClassLoader.findClass()
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary)
14:11:16.781 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary): calling childClassLoader().findClass() 
14:11:16.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.NativeLibrary): class com.sun.jna.NativeLibrary
14:11:16.782 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.NativeLibrary): class com.sun.jna.NativeLibrary
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.InputStreamReader)
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.InputStreamReader): calling childClassLoader.findClass()
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.InputStreamReader): class java.io.InputStreamReader
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.LinkedHashSet)
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.LinkedHashSet): calling childClassLoader.findClass()
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.LinkedHashSet): class java.util.LinkedHashSet
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Runtime)
14:11:16.783 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Runtime): calling childClassLoader.findClass()
14:11:16.784 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Runtime): class java.lang.Runtime
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.BufferedReader)
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.BufferedReader): calling childClassLoader.findClass()
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.BufferedReader): class java.io.BufferedReader
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Process)
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Process): calling childClassLoader.findClass()
14:11:16.786 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Process): class java.lang.Process
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.StringTokenizer)
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.StringTokenizer): calling childClassLoader.findClass()
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.StringTokenizer): class java.util.StringTokenizer
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy)
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy): calling childClassLoader.findClass()
14:11:16.788 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Proxy): class java.lang.reflect.Proxy
14:11:16.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError)
14:11:16.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError): calling childClassLoader.findClass()
14:11:16.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoClassDefFoundError): class java.lang.NoClassDefFoundError
14:11:16.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException)
14:11:16.789 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException): calling childClassLoader.findClass()
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.UndeclaredThrowableException): class java.lang.reflect.UndeclaredThrowableException
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils)
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils): calling childClassLoader.findClass()
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils)
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils): calling childClassLoader().findClass() 
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.internal.ReflectionUtils): class com.sun.jna.internal.ReflectionUtils
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.internal.ReflectionUtils): class com.sun.jna.internal.ReflectionUtils
14:11:16.790 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.AssertionError)
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.AssertionError): calling childClassLoader.findClass()
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.AssertionError): class java.lang.AssertionError
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles)
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles): calling childClassLoader.findClass()
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles): class java.lang.invoke.MethodHandles
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle)
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle): calling childClassLoader.findClass()
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandle): class java.lang.invoke.MethodHandle
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup)
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup): calling childClassLoader.findClass()
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodHandles$Lookup): class java.lang.invoke.MethodHandles$Lookup
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType)
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType): calling childClassLoader.findClass()
14:11:16.791 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.MethodType): class java.lang.invoke.MethodType
14:11:16.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Function)
14:11:16.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Function): calling childClassLoader.findClass()
14:11:16.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Function)
14:11:16.792 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Function): calling childClassLoader().findClass() 
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Function): class com.sun.jna.Function
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Function): class com.sun.jna.Function
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.ClassCastException)
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.ClassCastException): calling childClassLoader.findClass()
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.ClassCastException): class java.lang.ClassCastException
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext)
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext): calling childClassLoader.findClass()
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext)
14:11:16.793 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext): calling childClassLoader().findClass() 
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext)
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext): calling childClassLoader.findClass()
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext)
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext): calling childClassLoader().findClass() 
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.FunctionParameterContext): class com.sun.jna.FunctionParameterContext
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.FunctionParameterContext): class com.sun.jna.FunctionParameterContext
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.MethodParameterContext): class com.sun.jna.MethodParameterContext
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.MethodParameterContext): class com.sun.jna.MethodParameterContext
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker)
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker): calling childClassLoader.findClass()
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker)
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker): calling childClassLoader().findClass() 
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker): class com.sun.jna.VarArgsChecker
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker): class com.sun.jna.VarArgsChecker
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker)
14:11:16.794 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): calling childClassLoader.findClass()
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker)
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): calling childClassLoader().findClass() 
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): class com.sun.jna.VarArgsChecker$RealVarArgsChecker
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$RealVarArgsChecker): class com.sun.jna.VarArgsChecker$RealVarArgsChecker
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker)
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): calling childClassLoader.findClass()
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker)
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): calling childClassLoader().findClass() 
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): class com.sun.jna.VarArgsChecker$NoVarArgsChecker
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.VarArgsChecker$NoVarArgsChecker): class com.sun.jna.VarArgsChecker$NoVarArgsChecker
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo)
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo): calling childClassLoader.findClass()
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo)
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo): calling childClassLoader().findClass() 
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Library$Handler$FunctionInfo): class com.sun.jna.Library$Handler$FunctionInfo
14:11:16.795 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Library$Handler$FunctionInfo): class com.sun.jna.Library$Handler$FunctionInfo
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead)
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead): calling childClassLoader.findClass()
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead)
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead): calling childClassLoader().findClass() 
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.sun.jna.Function$PostCallRead): interface com.sun.jna.Function$PostCallRead
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jna.Function$PostCallRead): interface com.sun.jna.Function$PostCallRead
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability)
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability): calling childClassLoader.findClass()
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability)
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability): calling childClassLoader().findClass() 
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability): class org.apache.ranger.plugin.util.RangerPluginCapability
14:11:16.796 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability): class org.apache.ranger.plugin.util.RangerPluginCapability
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature)
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): calling childClassLoader.findClass()
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature)
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): calling childClassLoader().findClass() 
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Enum)
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Enum): calling childClassLoader.findClass()
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Enum): class java.lang.Enum
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): class org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature
14:11:16.797 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature): class org.apache.ranger.plugin.util.RangerPluginCapability$RangerPluginFeature
14:11:16.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder)
14:11:16.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder): calling childClassLoader.findClass()
14:11:16.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder)
14:11:16.798 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder): calling childClassLoader().findClass() 
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.GsonBuilder): class com.google.gson.GsonBuilder
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.GsonBuilder): class com.google.gson.GsonBuilder
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy)
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy): calling childClassLoader.findClass()
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy)
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy): calling childClassLoader().findClass() 
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingStrategy): interface com.google.gson.FieldNamingStrategy
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingStrategy): interface com.google.gson.FieldNamingStrategy
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder)
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder): calling childClassLoader.findClass()
14:11:16.799 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder)
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder): calling childClassLoader().findClass() 
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory)
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory): calling childClassLoader.findClass()
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory)
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory): calling childClassLoader().findClass() 
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapterFactory): interface com.google.gson.TypeAdapterFactory
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapterFactory): interface com.google.gson.TypeAdapterFactory
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Cloneable)
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Cloneable): calling childClassLoader.findClass()
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Cloneable): interface java.lang.Cloneable
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder): class com.google.gson.internal.Excluder
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder): class com.google.gson.internal.Excluder
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException)
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException): calling childClassLoader.findClass()
14:11:16.800 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.CloneNotSupportedException): class java.lang.CloneNotSupportedException
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter): calling childClassLoader.findClass()
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter): calling childClassLoader().findClass() 
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter): class com.google.gson.TypeAdapter
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter): class com.google.gson.TypeAdapter
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1): calling childClassLoader.findClass()
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1): calling childClassLoader().findClass() 
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.Excluder$1): class com.google.gson.internal.Excluder$1
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.Excluder$1): class com.google.gson.internal.Excluder$1
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy): calling childClassLoader.findClass()
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy)
14:11:16.801 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy): calling childClassLoader().findClass() 
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy): class com.google.gson.LongSerializationPolicy
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy): class com.google.gson.LongSerializationPolicy
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1): calling childClassLoader.findClass()
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1): calling childClassLoader().findClass() 
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$1): class com.google.gson.LongSerializationPolicy$1
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$1): class com.google.gson.LongSerializationPolicy$1
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2): calling childClassLoader.findClass()
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2): calling childClassLoader().findClass() 
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.LongSerializationPolicy$2): class com.google.gson.LongSerializationPolicy$2
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.LongSerializationPolicy$2): class com.google.gson.LongSerializationPolicy$2
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonElement)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonElement): calling childClassLoader.findClass()
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonElement)
14:11:16.802 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonElement): calling childClassLoader().findClass() 
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonElement): class com.google.gson.JsonElement
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonElement): class com.google.gson.JsonElement
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonNull)
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonNull): calling childClassLoader.findClass()
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonNull)
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonNull): calling childClassLoader().findClass() 
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonNull): class com.google.gson.JsonNull
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonNull): class com.google.gson.JsonNull
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive)
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive): calling childClassLoader.findClass()
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive)
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive): calling childClassLoader().findClass() 
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonPrimitive): class com.google.gson.JsonPrimitive
14:11:16.803 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonPrimitive): class com.google.gson.JsonPrimitive
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy): calling childClassLoader.findClass()
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy): calling childClassLoader().findClass() 
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy): class com.google.gson.FieldNamingPolicy
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy): class com.google.gson.FieldNamingPolicy
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1): calling childClassLoader.findClass()
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1): calling childClassLoader().findClass() 
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$1): class com.google.gson.FieldNamingPolicy$1
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$1): class com.google.gson.FieldNamingPolicy$1
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2): calling childClassLoader.findClass()
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2)
14:11:16.804 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2): calling childClassLoader().findClass() 
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$2): class com.google.gson.FieldNamingPolicy$2
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$2): class com.google.gson.FieldNamingPolicy$2
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3): calling childClassLoader.findClass()
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3): calling childClassLoader().findClass() 
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$3): class com.google.gson.FieldNamingPolicy$3
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$3): class com.google.gson.FieldNamingPolicy$3
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4): calling childClassLoader.findClass()
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4): calling childClassLoader().findClass() 
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$4): class com.google.gson.FieldNamingPolicy$4
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$4): class com.google.gson.FieldNamingPolicy$4
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5): calling childClassLoader.findClass()
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5)
14:11:16.805 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5): calling childClassLoader().findClass() 
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$5): class com.google.gson.FieldNamingPolicy$5
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$5): class com.google.gson.FieldNamingPolicy$5
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6): calling childClassLoader.findClass()
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6): calling childClassLoader().findClass() 
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$6): class com.google.gson.FieldNamingPolicy$6
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$6): class com.google.gson.FieldNamingPolicy$6
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7): calling childClassLoader.findClass()
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7): calling childClassLoader().findClass() 
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.FieldNamingPolicy$7): class com.google.gson.FieldNamingPolicy$7
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.FieldNamingPolicy$7): class com.google.gson.FieldNamingPolicy$7
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson): calling childClassLoader.findClass()
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson)
14:11:16.806 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson): calling childClassLoader().findClass() 
14:11:16.807 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson): class com.google.gson.Gson
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson): class com.google.gson.Gson
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy)
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy): calling childClassLoader.findClass()
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy)
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy): calling childClassLoader().findClass() 
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberStrategy): interface com.google.gson.ToNumberStrategy
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberStrategy): interface com.google.gson.ToNumberStrategy
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter)
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter): calling childClassLoader.findClass()
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter)
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter): calling childClassLoader().findClass() 
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$FutureTypeAdapter): class com.google.gson.Gson$FutureTypeAdapter
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$FutureTypeAdapter): class com.google.gson.Gson$FutureTypeAdapter
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.Type)
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.Type): calling childClassLoader.findClass()
14:11:16.808 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.Type): interface java.lang.reflect.Type
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.EOFException)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.EOFException): calling childClassLoader.findClass()
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.EOFException): class java.io.EOFException
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException): calling childClassLoader.findClass()
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException): calling childClassLoader().findClass() 
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException): calling childClassLoader.findClass()
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonParseException)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonParseException): calling childClassLoader().findClass() 
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonParseException): class com.google.gson.JsonParseException
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonParseException): class com.google.gson.JsonParseException
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonSyntaxException): class com.google.gson.JsonSyntaxException
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonSyntaxException): class com.google.gson.JsonSyntaxException
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader): calling childClassLoader.findClass()
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader)
14:11:16.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader): calling childClassLoader().findClass() 
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Closeable)
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Closeable): calling childClassLoader.findClass()
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Closeable): interface java.io.Closeable
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.JsonReader): class com.google.gson.stream.JsonReader
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonReader): class com.google.gson.stream.JsonReader
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader)
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader): calling childClassLoader.findClass()
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader)
14:11:16.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader): calling childClassLoader().findClass() 
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeReader): class com.google.gson.internal.bind.JsonTreeReader
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeReader): class com.google.gson.internal.bind.JsonTreeReader
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.StringReader)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.StringReader): calling childClassLoader.findClass()
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.StringReader): class java.io.StringReader
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException): calling childClassLoader.findClass()
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonIOException)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonIOException): calling childClassLoader().findClass() 
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonIOException): class com.google.gson.JsonIOException
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonIOException): class com.google.gson.JsonIOException
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Appendable)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Appendable): calling childClassLoader.findClass()
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Appendable): interface java.lang.Appendable
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter): calling childClassLoader.findClass()
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter)
14:11:16.811 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter): calling childClassLoader().findClass() 
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.Flushable)
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.Flushable): calling childClassLoader.findClass()
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.Flushable): interface java.io.Flushable
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.JsonWriter): class com.google.gson.stream.JsonWriter
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.JsonWriter): class com.google.gson.stream.JsonWriter
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter)
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter): calling childClassLoader.findClass()
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter)
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter): calling childClassLoader().findClass() 
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonTreeWriter): class com.google.gson.internal.bind.JsonTreeWriter
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonTreeWriter): class com.google.gson.internal.bind.JsonTreeWriter
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$3)
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$3): calling childClassLoader.findClass()
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$3)
14:11:16.812 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$3): calling childClassLoader().findClass() 
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$3): class com.google.gson.Gson$3
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$3): class com.google.gson.Gson$3
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$1)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$1): calling childClassLoader.findClass()
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$1)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$1): calling childClassLoader().findClass() 
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$1): class com.google.gson.Gson$1
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$1): class com.google.gson.Gson$1
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$2)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$2): calling childClassLoader.findClass()
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$2)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$2): calling childClassLoader().findClass() 
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$2): class com.google.gson.Gson$2
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$2): class com.google.gson.Gson$2
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException): calling childClassLoader.findClass()
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException)
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException): calling childClassLoader().findClass() 
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.stream.MalformedJsonException): class com.google.gson.stream.MalformedJsonException
14:11:16.813 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.stream.MalformedJsonException): class com.google.gson.stream.MalformedJsonException
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy): calling childClassLoader.findClass()
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy): calling childClassLoader().findClass() 
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy): class com.google.gson.ToNumberPolicy
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy): class com.google.gson.ToNumberPolicy
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1): calling childClassLoader.findClass()
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1): calling childClassLoader().findClass() 
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$1): class com.google.gson.ToNumberPolicy$1
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$1): class com.google.gson.ToNumberPolicy$1
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2): calling childClassLoader.findClass()
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2): calling childClassLoader().findClass() 
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$2): class com.google.gson.ToNumberPolicy$2
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$2): class com.google.gson.ToNumberPolicy$2
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3): calling childClassLoader.findClass()
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3)
14:11:16.814 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3): calling childClassLoader().findClass() 
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$3): class com.google.gson.ToNumberPolicy$3
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$3): class com.google.gson.ToNumberPolicy$3
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4): calling childClassLoader.findClass()
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4): calling childClassLoader().findClass() 
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.ToNumberPolicy$4): class com.google.gson.ToNumberPolicy$4
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.ToNumberPolicy$4): class com.google.gson.ToNumberPolicy$4
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber): calling childClassLoader.findClass()
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber): calling childClassLoader().findClass() 
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.LazilyParsedNumber): class com.google.gson.internal.LazilyParsedNumber
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.LazilyParsedNumber): class com.google.gson.internal.LazilyParsedNumber
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.math.BigDecimal)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.math.BigDecimal): calling childClassLoader.findClass()
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.math.BigDecimal): class java.math.BigDecimal
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken): calling childClassLoader.findClass()
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken)
14:11:16.815 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken): calling childClassLoader().findClass() 
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.reflect.TypeToken): class com.google.gson.reflect.TypeToken
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.reflect.TypeToken): class com.google.gson.reflect.TypeToken
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions)
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions): calling childClassLoader.findClass()
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions)
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions): calling childClassLoader().findClass() 
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Preconditions): class com.google.gson.internal.$Gson$Preconditions
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Preconditions): class com.google.gson.internal.$Gson$Preconditions
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types)
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types): calling childClassLoader.findClass()
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types)
14:11:16.816 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types): calling childClassLoader().findClass() 
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.$Gson$Types): class com.google.gson.internal.$Gson$Types
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.$Gson$Types): class com.google.gson.internal.$Gson$Types
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.NoSuchElementException)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.NoSuchElementException): calling childClassLoader.findClass()
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.NoSuchElementException): class java.util.NoSuchElementException
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType): calling childClassLoader.findClass()
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.ParameterizedType): interface java.lang.reflect.ParameterizedType
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType): calling childClassLoader.findClass()
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.GenericArrayType): interface java.lang.reflect.GenericArrayType
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType): calling childClassLoader.findClass()
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.reflect.WildcardType): interface java.lang.reflect.WildcardType
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport): calling childClassLoader.findClass()
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport)
14:11:16.817 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport): calling childClassLoader().findClass() 
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport): class com.google.gson.internal.sql.SqlTypesSupport
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport): class com.google.gson.internal.sql.SqlTypesSupport
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): calling childClassLoader.findClass()
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): calling childClassLoader().findClass() 
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1): calling childClassLoader.findClass()
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1): calling childClassLoader().findClass() 
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$1): class com.google.gson.internal.sql.SqlTypesSupport$1
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$1): class com.google.gson.internal.sql.SqlTypesSupport$1
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2): calling childClassLoader.findClass()
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2)
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2): calling childClassLoader().findClass() 
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTypesSupport$2): class com.google.gson.internal.sql.SqlTypesSupport$2
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTypesSupport$2): class com.google.gson.internal.sql.SqlTypesSupport$2
14:11:16.818 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Date)
14:11:16.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Date): calling childClassLoader.findClass()
14:11:16.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Date)
14:11:16.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Date): calling childClassLoader().findClass() 
14:11:16.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Date): calling componentClassLoader.findClass()
14:11:16.819 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Date): calling componentClassLoader.loadClass()
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Date): class java.sql.Date
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1)
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): calling childClassLoader.findClass()
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1)
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): calling childClassLoader().findClass() 
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1): class com.google.gson.internal.bind.DefaultDateTypeAdapter$DateType$1
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter)
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): calling childClassLoader.findClass()
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter)
14:11:16.820 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): calling childClassLoader().findClass() 
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): class com.google.gson.internal.bind.DefaultDateTypeAdapter
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DefaultDateTypeAdapter): class com.google.gson.internal.bind.DefaultDateTypeAdapter
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Date)
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Date): calling childClassLoader.findClass()
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Date): class java.util.Date
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Timestamp)
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Timestamp): calling childClassLoader.findClass()
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Timestamp)
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Timestamp): calling childClassLoader().findClass() 
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Timestamp): calling componentClassLoader.findClass()
14:11:16.821 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Timestamp): calling componentClassLoader.loadClass()
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Timestamp): class java.sql.Timestamp
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter)
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter): calling childClassLoader.findClass()
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter)
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter): calling childClassLoader().findClass() 
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter): class com.google.gson.internal.sql.SqlDateTypeAdapter
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter): class com.google.gson.internal.sql.SqlDateTypeAdapter
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1)
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1): calling childClassLoader.findClass()
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1)
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.TypeAdapter$1): class com.google.gson.TypeAdapter$1
14:11:16.822 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.TypeAdapter$1): class com.google.gson.TypeAdapter$1
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): class com.google.gson.internal.sql.SqlDateTypeAdapter$1
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlDateTypeAdapter$1): class com.google.gson.internal.sql.SqlDateTypeAdapter$1
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): calling childClassLoader.findClass()
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): calling childClassLoader().findClass() 
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): class com.google.gson.internal.sql.SqlTimeTypeAdapter
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter): class com.google.gson.internal.sql.SqlTimeTypeAdapter
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.sql.Time)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Time): calling childClassLoader.findClass()
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(java.sql.Time)
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Time): calling childClassLoader().findClass() 
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(java.sql.Time): calling componentClassLoader.findClass()
14:11:16.823 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.sql.Time): calling componentClassLoader.loadClass()
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.sql.Time): class java.sql.Time
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): class com.google.gson.internal.sql.SqlTimeTypeAdapter$1
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimeTypeAdapter$1): class com.google.gson.internal.sql.SqlTimeTypeAdapter$1
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): calling childClassLoader.findClass()
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): calling childClassLoader().findClass() 
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): class com.google.gson.internal.sql.SqlTimestampTypeAdapter
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter): class com.google.gson.internal.sql.SqlTimestampTypeAdapter
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1)
14:11:16.824 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): class com.google.gson.internal.sql.SqlTimestampTypeAdapter$1
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.sql.SqlTimestampTypeAdapter$1): class com.google.gson.internal.sql.SqlTimestampTypeAdapter$1
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Locale)
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Locale): calling childClassLoader.findClass()
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Locale): class java.util.Locale
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters)
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters): calling childClassLoader.findClass()
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters)
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters): calling childClassLoader().findClass() 
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters): class com.google.gson.internal.bind.TypeAdapters
14:11:16.825 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters): class com.google.gson.internal.bind.TypeAdapters
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3): calling childClassLoader.findClass()
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3): calling childClassLoader().findClass() 
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$3): class com.google.gson.internal.bind.TypeAdapters$3
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$3): class com.google.gson.internal.bind.TypeAdapters$3
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4): calling childClassLoader.findClass()
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4): calling childClassLoader().findClass() 
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$4): class com.google.gson.internal.bind.TypeAdapters$4
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$4): class com.google.gson.internal.bind.TypeAdapters$4
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5): calling childClassLoader.findClass()
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5)
14:11:16.826 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5): calling childClassLoader().findClass() 
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$5): class com.google.gson.internal.bind.TypeAdapters$5
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$5): class com.google.gson.internal.bind.TypeAdapters$5
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6): calling childClassLoader.findClass()
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6): calling childClassLoader().findClass() 
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$6): class com.google.gson.internal.bind.TypeAdapters$6
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$6): class com.google.gson.internal.bind.TypeAdapters$6
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7): calling childClassLoader.findClass()
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7): calling childClassLoader().findClass() 
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$7): class com.google.gson.internal.bind.TypeAdapters$7
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$7): class com.google.gson.internal.bind.TypeAdapters$7
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11): calling childClassLoader.findClass()
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11): calling childClassLoader().findClass() 
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$11): class com.google.gson.internal.bind.TypeAdapters$11
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$11): class com.google.gson.internal.bind.TypeAdapters$11
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12): calling childClassLoader.findClass()
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12)
14:11:16.827 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12): calling childClassLoader().findClass() 
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$12): class com.google.gson.internal.bind.TypeAdapters$12
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$12): class com.google.gson.internal.bind.TypeAdapters$12
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13): calling childClassLoader.findClass()
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13): calling childClassLoader().findClass() 
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$13): class com.google.gson.internal.bind.TypeAdapters$13
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$13): class com.google.gson.internal.bind.TypeAdapters$13
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14): calling childClassLoader.findClass()
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14): calling childClassLoader().findClass() 
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$14): class com.google.gson.internal.bind.TypeAdapters$14
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$14): class com.google.gson.internal.bind.TypeAdapters$14
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15): calling childClassLoader.findClass()
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15): calling childClassLoader().findClass() 
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$15): class com.google.gson.internal.bind.TypeAdapters$15
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$15): class com.google.gson.internal.bind.TypeAdapters$15
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16): calling childClassLoader.findClass()
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16)
14:11:16.828 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16): calling childClassLoader().findClass() 
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$16): class com.google.gson.internal.bind.TypeAdapters$16
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$16): class com.google.gson.internal.bind.TypeAdapters$16
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17): calling childClassLoader.findClass()
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17): calling childClassLoader().findClass() 
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$17): class com.google.gson.internal.bind.TypeAdapters$17
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$17): class com.google.gson.internal.bind.TypeAdapters$17
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18): calling childClassLoader.findClass()
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18): calling childClassLoader().findClass() 
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$18): class com.google.gson.internal.bind.TypeAdapters$18
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$18): class com.google.gson.internal.bind.TypeAdapters$18
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19): calling childClassLoader.findClass()
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19)
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19): calling childClassLoader().findClass() 
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$19): class com.google.gson.internal.bind.TypeAdapters$19
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$19): class com.google.gson.internal.bind.TypeAdapters$19
14:11:16.829 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20): calling childClassLoader.findClass()
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20): calling childClassLoader().findClass() 
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$20): class com.google.gson.internal.bind.TypeAdapters$20
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$20): class com.google.gson.internal.bind.TypeAdapters$20
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21): calling childClassLoader.findClass()
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21): calling childClassLoader().findClass() 
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$21): class com.google.gson.internal.bind.TypeAdapters$21
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$21): class com.google.gson.internal.bind.TypeAdapters$21
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22): calling childClassLoader.findClass()
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22): calling childClassLoader().findClass() 
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$22): class com.google.gson.internal.bind.TypeAdapters$22
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$22): class com.google.gson.internal.bind.TypeAdapters$22
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23): calling childClassLoader.findClass()
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23)
14:11:16.830 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23): calling childClassLoader().findClass() 
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$23): class com.google.gson.internal.bind.TypeAdapters$23
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$23): class com.google.gson.internal.bind.TypeAdapters$23
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24): calling childClassLoader.findClass()
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24): calling childClassLoader().findClass() 
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$24): class com.google.gson.internal.bind.TypeAdapters$24
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$24): class com.google.gson.internal.bind.TypeAdapters$24
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26): calling childClassLoader.findClass()
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26): calling childClassLoader().findClass() 
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$26): class com.google.gson.internal.bind.TypeAdapters$26
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$26): class com.google.gson.internal.bind.TypeAdapters$26
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27): calling childClassLoader.findClass()
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27)
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27): calling childClassLoader().findClass() 
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$27): class com.google.gson.internal.bind.TypeAdapters$27
14:11:16.831 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$27): class com.google.gson.internal.bind.TypeAdapters$27
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28): calling childClassLoader.findClass()
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28): calling childClassLoader().findClass() 
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$28): class com.google.gson.internal.bind.TypeAdapters$28
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$28): class com.google.gson.internal.bind.TypeAdapters$28
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1): calling childClassLoader.findClass()
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1): calling childClassLoader().findClass() 
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$1): class com.google.gson.internal.bind.TypeAdapters$1
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$1): class com.google.gson.internal.bind.TypeAdapters$1
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31): calling childClassLoader.findClass()
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31)
14:11:16.832 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31): calling childClassLoader().findClass() 
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$31): class com.google.gson.internal.bind.TypeAdapters$31
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$31): class com.google.gson.internal.bind.TypeAdapters$31
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2)
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2): calling childClassLoader.findClass()
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2)
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2): calling childClassLoader().findClass() 
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$2): class com.google.gson.internal.bind.TypeAdapters$2
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$2): class com.google.gson.internal.bind.TypeAdapters$2
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.BitSet)
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.BitSet): calling childClassLoader.findClass()
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.BitSet): class java.util.BitSet
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32)
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32): calling childClassLoader.findClass()
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32)
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32): calling childClassLoader().findClass() 
14:11:16.833 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$32): class com.google.gson.internal.bind.TypeAdapters$32
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$32): class com.google.gson.internal.bind.TypeAdapters$32
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8)
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8): calling childClassLoader.findClass()
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8)
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8): calling childClassLoader().findClass() 
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$8): class com.google.gson.internal.bind.TypeAdapters$8
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$8): class com.google.gson.internal.bind.TypeAdapters$8
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger)
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger): calling childClassLoader.findClass()
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicInteger): class java.util.concurrent.atomic.AtomicInteger
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9)
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9): calling childClassLoader.findClass()
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9)
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9): calling childClassLoader().findClass() 
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$9): class com.google.gson.internal.bind.TypeAdapters$9
14:11:16.834 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$9): class com.google.gson.internal.bind.TypeAdapters$9
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10)
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10): calling childClassLoader.findClass()
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10)
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10): calling childClassLoader().findClass() 
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$10): class com.google.gson.internal.bind.TypeAdapters$10
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$10): class com.google.gson.internal.bind.TypeAdapters$10
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray)
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray): calling childClassLoader.findClass()
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicIntegerArray): class java.util.concurrent.atomic.AtomicIntegerArray
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.math.BigInteger)
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.math.BigInteger): calling childClassLoader.findClass()
14:11:16.835 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.math.BigInteger): class java.math.BigInteger
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.StringBuffer)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.StringBuffer): calling childClassLoader.findClass()
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.StringBuffer): class java.lang.StringBuffer
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URI)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URI): calling childClassLoader.findClass()
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URI): class java.net.URI
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34): calling childClassLoader.findClass()
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34): calling childClassLoader().findClass() 
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34): class com.google.gson.internal.bind.TypeAdapters$34
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34): class com.google.gson.internal.bind.TypeAdapters$34
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1): calling childClassLoader.findClass()
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1)
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1): calling childClassLoader().findClass() 
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$34$1): class com.google.gson.internal.bind.TypeAdapters$34$1
14:11:16.836 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$34$1): class com.google.gson.internal.bind.TypeAdapters$34$1
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.UUID)
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.UUID): calling childClassLoader.findClass()
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.UUID): class java.util.UUID
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25)
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25): calling childClassLoader.findClass()
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25)
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25): calling childClassLoader().findClass() 
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$25): class com.google.gson.internal.bind.TypeAdapters$25
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$25): class com.google.gson.internal.bind.TypeAdapters$25
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Currency)
14:11:16.837 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Currency): calling childClassLoader.findClass()
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Currency): class java.util.Currency
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Calendar)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Calendar): calling childClassLoader.findClass()
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Calendar): class java.util.Calendar
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.GregorianCalendar)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.GregorianCalendar): calling childClassLoader.findClass()
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.GregorianCalendar): class java.util.GregorianCalendar
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33): calling childClassLoader.findClass()
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33): calling childClassLoader().findClass() 
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$33): class com.google.gson.internal.bind.TypeAdapters$33
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$33): class com.google.gson.internal.bind.TypeAdapters$33
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonArray)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonArray): calling childClassLoader.findClass()
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonArray)
14:11:16.838 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonArray): calling childClassLoader().findClass() 
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Iterable)
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Iterable): calling childClassLoader.findClass()
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Iterable): interface java.lang.Iterable
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonArray): class com.google.gson.JsonArray
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonArray): class com.google.gson.JsonArray
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.JsonObject)
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.JsonObject): calling childClassLoader.findClass()
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.JsonObject)
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.JsonObject): calling childClassLoader().findClass() 
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.JsonObject): class com.google.gson.JsonObject
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.JsonObject): class com.google.gson.JsonObject
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29)
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29): calling childClassLoader.findClass()
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29)
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29): calling childClassLoader().findClass() 
14:11:16.839 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$29): class com.google.gson.internal.bind.TypeAdapters$29
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$29): class com.google.gson.internal.bind.TypeAdapters$29
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter)
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): calling childClassLoader.findClass()
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter)
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): calling childClassLoader().findClass() 
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): class com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter): class com.google.gson.internal.bind.TypeAdapters$EnumTypeAdapter
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap)
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap): calling childClassLoader.findClass()
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.ConcurrentHashMap): class java.util.concurrent.ConcurrentHashMap
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor)
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor): calling childClassLoader.findClass()
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor)
14:11:16.840 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor): calling childClassLoader().findClass() 
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.ConstructorConstructor): class com.google.gson.internal.ConstructorConstructor
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.ConstructorConstructor): class com.google.gson.internal.ConstructorConstructor
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor)
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor): calling childClassLoader.findClass()
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor)
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor): calling childClassLoader().findClass() 
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.ObjectConstructor): interface com.google.gson.internal.ObjectConstructor
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.ObjectConstructor): interface com.google.gson.internal.ObjectConstructor
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter)
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter): calling childClassLoader.findClass()
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter)
14:11:16.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter): calling childClassLoader().findClass() 
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter): class com.google.gson.internal.bind.ObjectTypeAdapter
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter): class com.google.gson.internal.bind.ObjectTypeAdapter
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1)
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1)
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): class com.google.gson.internal.bind.ObjectTypeAdapter$1
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ObjectTypeAdapter$1): class com.google.gson.internal.bind.ObjectTypeAdapter$1
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter)
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter): calling childClassLoader.findClass()
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter)
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter): calling childClassLoader().findClass() 
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter): class com.google.gson.internal.bind.NumberTypeAdapter
14:11:16.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter): class com.google.gson.internal.bind.NumberTypeAdapter
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.NumberTypeAdapter$1): class com.google.gson.internal.bind.NumberTypeAdapter$1
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.NumberTypeAdapter$1): class com.google.gson.internal.bind.NumberTypeAdapter$1
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong): calling childClassLoader.findClass()
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLong): class java.util.concurrent.atomic.AtomicLong
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$4)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$4): calling childClassLoader.findClass()
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$4)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$4): calling childClassLoader().findClass() 
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$4): class com.google.gson.Gson$4
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$4): class com.google.gson.Gson$4
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray)
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray): calling childClassLoader.findClass()
14:11:16.843 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.atomic.AtomicLongArray): class java.util.concurrent.atomic.AtomicLongArray
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.Gson$5)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.Gson$5): calling childClassLoader.findClass()
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.Gson$5)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.Gson$5): calling childClassLoader().findClass() 
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.Gson$5): class com.google.gson.Gson$5
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.Gson$5): class com.google.gson.Gson$5
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter): calling childClassLoader.findClass()
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter): calling childClassLoader().findClass() 
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter): class com.google.gson.internal.bind.DateTypeAdapter
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter): class com.google.gson.internal.bind.DateTypeAdapter
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1)
14:11:16.844 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.DateTypeAdapter$1): class com.google.gson.internal.bind.DateTypeAdapter$1
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.DateTypeAdapter$1): class com.google.gson.internal.bind.DateTypeAdapter$1
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter)
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter): calling childClassLoader.findClass()
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter)
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter): calling childClassLoader().findClass() 
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter): class com.google.gson.internal.bind.ArrayTypeAdapter
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter): class com.google.gson.internal.bind.ArrayTypeAdapter
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper)
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): calling childClassLoader.findClass()
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper)
14:11:16.845 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): calling childClassLoader().findClass() 
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): class com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper): class com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): calling childClassLoader.findClass()
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): calling childClassLoader().findClass() 
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): class com.google.gson.internal.bind.ArrayTypeAdapter$1
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ArrayTypeAdapter$1): class com.google.gson.internal.bind.ArrayTypeAdapter$1
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): calling childClassLoader.findClass()
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): calling childClassLoader().findClass() 
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): class com.google.gson.internal.bind.CollectionTypeAdapterFactory
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory): class com.google.gson.internal.bind.CollectionTypeAdapterFactory
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter)
14:11:16.846 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory)
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory): calling childClassLoader.findClass()
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory)
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory): calling childClassLoader().findClass() 
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory): class com.google.gson.internal.bind.MapTypeAdapterFactory
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory): class com.google.gson.internal.bind.MapTypeAdapterFactory
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter)
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter)
14:11:16.847 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory)
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): calling childClassLoader.findClass()
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory)
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): calling childClassLoader().findClass() 
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): class com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory): class com.google.gson.internal.bind.JsonAdapterAnnotationTypeAdapterFactory
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter)
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter): calling childClassLoader.findClass()
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter)
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter): calling childClassLoader().findClass() 
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.TreeTypeAdapter): class com.google.gson.internal.bind.TreeTypeAdapter
14:11:16.848 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.TreeTypeAdapter): class com.google.gson.internal.bind.TreeTypeAdapter
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): calling childClassLoader.findClass()
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): calling childClassLoader().findClass() 
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): calling childClassLoader.findClass()
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): calling childClassLoader().findClass() 
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): calling childClassLoader.findClass()
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField)
14:11:16.849 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): calling childClassLoader().findClass() 
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$BoundField
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1)
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): calling childClassLoader.findClass()
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1)
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): calling childClassLoader().findClass() 
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1): class com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1
14:11:16.850 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.init(http://127.25.254.212:40687, /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml)
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient)
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient): calling childClassLoader.findClass()
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient)
14:11:16.850 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient): calling childClassLoader().findClass() 
14:11:16.851 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRESTClient): class org.apache.ranger.plugin.util.RangerRESTClient
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRESTClient): class org.apache.ranger.plugin.util.RangerRESTClient
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException)
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): calling childClassLoader.findClass()
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException)
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException): calling childClassLoader().findClass() 
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientHandlerException): calling componentClassLoader.findClass()
14:11:16.852 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): calling componentClassLoader.loadClass()
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientHandlerException): class com.sun.jersey.api.client.ClientHandlerException
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException)
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException): calling childClassLoader.findClass()
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.NoSuchAlgorithmException): class java.security.NoSuchAlgorithmException
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.KeyStoreException)
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.KeyStoreException): calling childClassLoader.findClass()
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.KeyStoreException): class java.security.KeyStoreException
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.KeyManagementException)
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.KeyManagementException): calling childClassLoader.findClass()
14:11:16.853 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.KeyManagementException): class java.security.KeyManagementException
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.cert.CertificateException)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.cert.CertificateException): calling childClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.cert.CertificateException): class java.security.cert.CertificateException
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileNotFoundException)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileNotFoundException): calling childClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileNotFoundException): class java.io.FileNotFoundException
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException): calling childClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.UnrecoverableKeyException): class java.security.UnrecoverableKeyException
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileInputStream)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileInputStream): calling childClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileInputStream): class java.io.FileInputStream
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): calling childClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig)
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig): calling childClassLoader().findClass() 
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.ClientConfig): calling componentClassLoader.findClass()
14:11:16.854 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): calling componentClassLoader.loadClass()
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.ClientConfig): interface com.sun.jersey.api.client.config.ClientConfig
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier)
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier): calling childClassLoader.findClass()
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.HostnameVerifier): interface javax.net.ssl.HostnameVerifier
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter)
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): calling childClassLoader.findClass()
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter)
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter): calling childClassLoader().findClass() 
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.ClientFilter): calling componentClassLoader.findClass()
14:11:16.856 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): calling componentClassLoader.loadClass()
14:11:16.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.ClientFilter): class com.sun.jersey.api.client.filter.ClientFilter
14:11:16.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter)
14:11:16.857 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling childClassLoader.findClass()
14:11:16.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter)
14:11:16.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling childClassLoader().findClass() 
14:11:16.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling componentClassLoader.findClass()
14:11:16.858 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): calling componentClassLoader.loadClass()
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.filter.HTTPBasicAuthFilter): class com.sun.jersey.api.client.filter.HTTPBasicAuthFilter
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory)
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory): calling childClassLoader.findClass()
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.KeyManagerFactory): class javax.net.ssl.KeyManagerFactory
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory)
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory): calling childClassLoader.findClass()
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.net.ssl.TrustManagerFactory): class javax.net.ssl.TrustManagerFactory
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Random)
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Random): calling childClassLoader.findClass()
14:11:16.859 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Random): class java.util.Random
14:11:16.860 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.init(http://127.25.254.212:40687, /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ranger-kms-policymgr-ssl.xml)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil): calling childClassLoader.findClass()
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil): calling childClassLoader().findClass() 
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.URLEncoderUtil): class org.apache.ranger.plugin.util.URLEncoderUtil
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.URLEncoderUtil): class org.apache.ranger.plugin.util.URLEncoderUtil
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.net.URLEncoder)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.net.URLEncoder): calling childClassLoader.findClass()
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.net.URLEncoder): class java.net.URLEncoder
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== RangerBasePlugin.createAdminClient(kms, kms, ranger.plugin.kms): policySourceImpl=org.apache.ranger.admin.client.RangerAdminRESTClient, client=org.apache.ranger.admin.client.RangerAdminRESTClient@2c63a4e4
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider): calling childClassLoader.findClass()
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider)
14:11:16.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider): calling childClassLoader().findClass() 
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesProvider): class org.apache.ranger.plugin.util.RangerRolesProvider
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesProvider): class org.apache.ranger.plugin.util.RangerRolesProvider
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).RangerRolesProvider()
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms).RangerRolesProvider()
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).PolicyRefresher()
14:11:16.861 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Created PolicyRefresher Thread(PolicyRefresher(serviceName=kms)-24)
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadRoles()
14:11:16.861 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName= kms serviceType= kms).loadUserGroupRoles()
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): calling childClassLoader.findClass()
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): calling childClassLoader().findClass() 
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): class org.apache.ranger.plugin.util.RangerPerfTracerFactory
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfTracerFactory): class org.apache.ranger.plugin.util.RangerPerfTracerFactory
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): calling childClassLoader.findClass()
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): calling childClassLoader().findClass() 
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): class org.apache.ranger.plugin.util.RangerPerfCollectorTracer
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPerfCollectorTracer): class org.apache.ranger.plugin.util.RangerPerfCollectorTracer
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory): calling childClassLoader.findClass()
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ManagementFactory): class java.lang.management.ManagementFactory
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean)
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean): calling childClassLoader.findClass()
14:11:16.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ThreadMXBean): interface java.lang.management.ThreadMXBean
14:11:16.862 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeSupported (by JVM)  = true
14:11:16.863 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeEnabled  = true
14:11:16.863 [main] INFO org.apache.ranger.perf.policyengine.init -- ThreadCPUTimeEnabled  = true
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo): calling childClassLoader.findClass()
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.management.ThreadInfo): class java.lang.management.ThreadInfo
14:11:16.863 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 103188440, Free memory:211384360
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).loadUserGroupRolesFromAdmin()
14:11:16.863 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.getRolesIfUpdated(-1, 0)
14:11:16.863 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- Checking Roles updated as user : rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory): calling childClassLoader.findClass()
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.invoke.LambdaMetafactory): class java.lang.invoke.LambdaMetafactory
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction): calling childClassLoader.findClass()
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.security.PrivilegedExceptionAction): interface java.security.PrivilegedExceptionAction
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): calling childClassLoader.findClass()
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse)
14:11:16.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse): calling childClassLoader().findClass() 
14:11:16.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.ClientResponse): calling componentClassLoader.findClass()
14:11:16.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): calling componentClassLoader.loadClass()
14:11:16.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.ClientResponse): class com.sun.jersey.api.client.ClientResponse
14:11:16.866 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)][action: org.apache.ranger.admin.client.RangerAdminRESTClient$$Lambda$157/0x00007f3d08286dd0@50a2b309]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.ranger.audit.provider.MiscUtil.executePrivilegedAction(MiscUtil.java:560)
	at org.apache.ranger.admin.client.RangerAdminRESTClient.getRolesIfUpdated(RangerAdminRESTClient.java:221)
	at org.apache.ranger.plugin.util.RangerRolesProvider.loadUserGroupRolesFromAdmin(RangerRolesProvider.java:172)
	at org.apache.ranger.plugin.util.RangerRolesProvider.loadUserGroupRoles(RangerRolesProvider.java:112)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadRoles(PolicyRefresher.java:563)
	at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:138)
	at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:310)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin.init(RangerKmsAuthorizer.java:346)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:303)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:127)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:153)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
	at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
	at java.base/java.lang.Class.newInstance(Class.java:647)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:70)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.getKeyAcls(KMSWebApp.java:254)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:143)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig)
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling childClassLoader.findClass()
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig)
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling childClassLoader().findClass() 
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling componentClassLoader.findClass()
14:11:16.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): calling componentClassLoader.loadClass()
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.config.DefaultClientConfig): class com.sun.jersey.api.client.config.DefaultClientConfig
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider)
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling childClassLoader.findClass()
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider)
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling childClassLoader().findClass() 
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling componentClassLoader.findClass()
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): calling componentClassLoader.loadClass()
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider): class com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client)
14:11:16.870 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): calling childClassLoader.findClass()
14:11:16.871 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client)
14:11:16.871 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client): calling childClassLoader().findClass() 
14:11:16.871 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.Client): calling componentClassLoader.findClass()
14:11:16.871 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): calling componentClassLoader.loadClass()
14:11:16.872 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.Client): class com.sun.jersey.api.client.Client
14:11:16.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:11:16.881 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/jersey-client-components): calling childClassLoader.findResources()
14:11:16.882 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): calling componentClassLoader.getResources()
14:11:16.882 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): java.lang.CompoundEnumeration@298413d4
14:11:16.882 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:11:16.885 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:11:16.885 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling childClassLoader.findResources()
14:11:16.885 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling componentClassLoader.getResources()
14:11:16.886 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): java.lang.CompoundEnumeration@3f908634
14:11:16.886 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:11:16.888 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:11:16.888 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling childClassLoader.findResources()
14:11:16.888 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling componentClassLoader.getResources()
14:11:16.888 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): java.lang.CompoundEnumeration@75465bd4
14:11:16.888 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:11:16.890 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider)
14:11:16.890 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling childClassLoader.findClass()
14:11:16.891 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider)
14:11:16.891 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling childClassLoader().findClass() 
14:11:16.891 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling componentClassLoader.findClass()
14:11:16.891 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): calling componentClassLoader.loadClass()
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider): class com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider)
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling childClassLoader.findClass()
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider)
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling childClassLoader().findClass() 
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling componentClassLoader.findClass()
14:11:16.892 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): calling componentClassLoader.loadClass()
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider): class com.sun.jersey.core.impl.provider.xml.XMLStreamReaderContextProvider
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider)
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling childClassLoader.findClass()
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider)
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling childClassLoader().findClass() 
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling componentClassLoader.findClass()
14:11:16.893 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): calling componentClassLoader.loadClass()
14:11:16.894 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider): class com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider)
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling childClassLoader.findClass()
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider)
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling childClassLoader().findClass() 
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling componentClassLoader.findClass()
14:11:16.895 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): calling componentClassLoader.loadClass()
14:11:16.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider): class com.sun.jersey.core.impl.provider.xml.TransformerFactoryProvider
14:11:16.898 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct)
14:11:16.898 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): calling childClassLoader.findClass()
14:11:16.898 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.annotation.PostConstruct)
14:11:16.899 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PostConstruct): calling childClassLoader().findClass() 
14:11:16.899 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PostConstruct): calling componentClassLoader.findClass()
14:11:16.899 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): calling componentClassLoader.loadClass()
14:11:16.899 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.annotation.PostConstruct): interface javax.annotation.PostConstruct
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy)
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): calling childClassLoader.findClass()
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(javax.annotation.PreDestroy)
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PreDestroy): calling childClassLoader().findClass() 
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(javax.annotation.PreDestroy): calling componentClassLoader.findClass()
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): calling componentClassLoader.loadClass()
14:11:16.908 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(javax.annotation.PreDestroy): interface javax.annotation.PreDestroy
14:11:16.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate) 
14:11:16.990 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate): calling componentClassLoader.getResources()
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResource(META-INF/services/javax.ws.rs.ext.RuntimeDelegate): jar:file:/tmp/dist-test-taskMMfo7I/thirdparty/src/ranger-2.6.0-kms/ews/lib/jersey-bundle-1.19.4.jar!/META-INF/services/javax.ws.rs.ext.RuntimeDelegate
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl)
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling childClassLoader.findClass()
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl)
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling childClassLoader().findClass() 
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling componentClassLoader.findClass()
14:11:16.991 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): calling componentClassLoader.loadClass()
14:11:16.993 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.provider.RuntimeDelegateImpl): class com.sun.jersey.server.impl.provider.RuntimeDelegateImpl
14:11:16.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider) 
14:11:16.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): calling childClassLoader.findResources()
14:11:16.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): calling componentClassLoader.getResources()
14:11:16.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider): java.lang.CompoundEnumeration@15473142
14:11:16.997 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.HeaderDelegateProvider) 
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider)
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling childClassLoader.findClass()
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider)
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling childClassLoader().findClass() 
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling componentClassLoader.findClass()
14:11:16.999 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): calling componentClassLoader.loadClass()
14:11:17.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.LocaleProvider): class com.sun.jersey.core.impl.provider.header.LocaleProvider
14:11:17.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider)
14:11:17.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling childClassLoader.findClass()
14:11:17.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider)
14:11:17.000 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling childClassLoader().findClass() 
14:11:17.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling componentClassLoader.findClass()
14:11:17.001 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): calling componentClassLoader.loadClass()
14:11:17.002 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.EntityTagProvider): class com.sun.jersey.core.impl.provider.header.EntityTagProvider
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider)
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling childClassLoader.findClass()
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider)
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling childClassLoader().findClass() 
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling componentClassLoader.findClass()
14:11:17.003 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): calling componentClassLoader.loadClass()
14:11:17.004 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.MediaTypeProvider): class com.sun.jersey.core.impl.provider.header.MediaTypeProvider
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider)
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling childClassLoader.findClass()
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider)
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling childClassLoader().findClass() 
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling componentClassLoader.findClass()
14:11:17.005 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): calling componentClassLoader.loadClass()
14:11:17.006 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CacheControlProvider): class com.sun.jersey.core.impl.provider.header.CacheControlProvider
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider)
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling childClassLoader.findClass()
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider)
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling childClassLoader().findClass() 
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling componentClassLoader.findClass()
14:11:17.007 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): calling componentClassLoader.loadClass()
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.NewCookieProvider): class com.sun.jersey.core.impl.provider.header.NewCookieProvider
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider)
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling childClassLoader.findClass()
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider)
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling childClassLoader().findClass() 
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling componentClassLoader.findClass()
14:11:17.009 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): calling componentClassLoader.loadClass()
14:11:17.010 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.CookieProvider): class com.sun.jersey.core.impl.provider.header.CookieProvider
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider)
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling childClassLoader.findClass()
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider)
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling childClassLoader().findClass() 
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling componentClassLoader.findClass()
14:11:17.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): calling componentClassLoader.loadClass()
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.URIProvider): class com.sun.jersey.core.impl.provider.header.URIProvider
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider)
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling childClassLoader.findClass()
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider)
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling childClassLoader().findClass() 
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling componentClassLoader.findClass()
14:11:17.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): calling componentClassLoader.loadClass()
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.DateProvider): class com.sun.jersey.core.impl.provider.header.DateProvider
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider)
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling childClassLoader.findClass()
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider)
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling childClassLoader().findClass() 
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling componentClassLoader.findClass()
14:11:17.014 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): calling componentClassLoader.loadClass()
14:11:17.015 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.header.StringProvider): class com.sun.jersey.core.impl.provider.header.StringProvider
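The traces above all follow the same delegation order: `loadClass()` first tries the plugin (child) classpath via `findClass()`, and only when the child cannot resolve the name does it fall back to the component (parent) classloader. A minimal sketch of that child-first pattern — not Ranger's actual implementation, just an illustration of the order the log shows — looks like this:

```java
// Child-first (parent-last) classloader sketch. Assumption: this mirrors
// only the delegation ORDER visible in the RangerPluginClassLoader traces
// above, not the real class's filtering or resource-lookup logic.
public class ChildFirstClassLoader extends ClassLoader {
    public ChildFirstClassLoader(ClassLoader component) {
        super(component); // the "componentClassLoader" in the log's terms
    }

    @Override
    public Class<?> loadClass(String name) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // "calling childClassLoader.findClass()"
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    // child miss -> "calling componentClassLoader.loadClass()"
                    c = getParent().loadClass(name);
                }
            }
            return c;
        }
    }

    public static void main(String[] args) throws Exception {
        // This sketch defines no child classpath, so every lookup falls
        // through to the parent, just as many of the traces above do.
        ClassLoader cl =
            new ChildFirstClassLoader(ChildFirstClassLoader.class.getClassLoader());
        System.out.println(cl.loadClass("java.lang.String") == String.class);
    }
}
```

This inversion of the JDK's default parent-first delegation is what lets the plugin ship its own Jersey bundle (loaded from the child's jar) while JDK and shared classes such as `javax.annotation.PostConstruct` still resolve through the component classloader.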
14:11:17.026 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:11:17.026 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling childClassLoader.findResources()
14:11:17.026 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling componentClassLoader.getResources()
14:11:17.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): java.lang.CompoundEnumeration@3e003d67
14:11:17.027 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider)
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling childClassLoader.findClass()
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider)
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling childClassLoader().findClass() 
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling componentClassLoader.findClass()
14:11:17.030 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): calling componentClassLoader.loadClass()
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StringProvider): class com.sun.jersey.core.impl.provider.entity.StringProvider
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider)
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling childClassLoader.findClass()
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider)
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling childClassLoader().findClass() 
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling componentClassLoader.findClass()
14:11:17.032 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): calling componentClassLoader.loadClass()
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ByteArrayProvider): class com.sun.jersey.core.impl.provider.entity.ByteArrayProvider
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider)
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling childClassLoader.findClass()
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider)
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling childClassLoader().findClass() 
14:11:17.033 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling componentClassLoader.findClass()
14:11:17.034 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): calling componentClassLoader.loadClass()
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FileProvider): class com.sun.jersey.core.impl.provider.entity.FileProvider
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider)
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling childClassLoader.findClass()
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider)
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling childClassLoader().findClass() 
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling componentClassLoader.findClass()
14:11:17.035 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): calling componentClassLoader.loadClass()
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.InputStreamProvider): class com.sun.jersey.core.impl.provider.entity.InputStreamProvider
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider)
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling childClassLoader.findClass()
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider)
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling childClassLoader().findClass() 
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling componentClassLoader.findClass()
14:11:17.037 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): calling componentClassLoader.loadClass()
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DataSourceProvider): class com.sun.jersey.core.impl.provider.entity.DataSourceProvider
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider)
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling childClassLoader.findClass()
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider)
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling childClassLoader().findClass() 
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling componentClassLoader.findClass()
14:11:17.038 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): calling componentClassLoader.loadClass()
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.RenderedImageProvider): class com.sun.jersey.core.impl.provider.entity.RenderedImageProvider
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider)
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling childClassLoader.findClass()
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider)
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling childClassLoader().findClass() 
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling componentClassLoader.findClass()
14:11:17.040 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): calling componentClassLoader.loadClass()
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider): class com.sun.jersey.core.impl.provider.entity.MimeMultipartProvider
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider)
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling childClassLoader.findClass()
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider)
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling childClassLoader().findClass() 
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling componentClassLoader.findClass()
14:11:17.041 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): calling componentClassLoader.loadClass()
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormProvider): class com.sun.jersey.core.impl.provider.entity.FormProvider
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider)
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling childClassLoader.findClass()
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider)
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling childClassLoader().findClass() 
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling componentClassLoader.findClass()
14:11:17.043 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): calling componentClassLoader.loadClass()
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider): class com.sun.jersey.core.impl.provider.entity.FormMultivaluedMapProvider
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App)
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling childClassLoader.findClass()
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App)
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling childClassLoader().findClass() 
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling componentClassLoader.findClass()
14:11:17.045 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.047 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$App
14:11:17.047 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text)
14:11:17.047 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling childClassLoader.findClass()
14:11:17.047 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text)
14:11:17.047 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling childClassLoader().findClass() 
14:11:17.048 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling componentClassLoader.findClass()
14:11:17.048 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): calling componentClassLoader.loadClass()
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$Text
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General)
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling childClassLoader.findClass()
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General)
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling childClassLoader().findClass() 
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling componentClassLoader.findClass()
14:11:17.049 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App)
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling childClassLoader.findClass()
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App)
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling childClassLoader().findClass() 
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling componentClassLoader.findClass()
14:11:17.050 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$App
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text)
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling childClassLoader.findClass()
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text)
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling childClassLoader().findClass() 
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling componentClassLoader.findClass()
14:11:17.052 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): calling componentClassLoader.loadClass()
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$Text
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General)
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling childClassLoader.findClass()
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General)
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling childClassLoader().findClass() 
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling componentClassLoader.findClass()
14:11:17.054 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App)
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling childClassLoader.findClass()
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App)
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling childClassLoader().findClass() 
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling componentClassLoader.findClass()
14:11:17.055 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$App
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text)
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling childClassLoader.findClass()
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text)
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling childClassLoader().findClass() 
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling componentClassLoader.findClass()
14:11:17.057 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): calling componentClassLoader.loadClass()
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$Text
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General)
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling childClassLoader.findClass()
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General)
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling childClassLoader().findClass() 
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling componentClassLoader.findClass()
14:11:17.059 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider)
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling childClassLoader.findClass()
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider)
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling childClassLoader().findClass() 
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling componentClassLoader.findClass()
14:11:17.060 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): calling componentClassLoader.loadClass()
14:11:17.061 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.ReaderProvider): class com.sun.jersey.core.impl.provider.entity.ReaderProvider
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider)
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling childClassLoader.findClass()
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider)
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling childClassLoader().findClass() 
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling componentClassLoader.findClass()
14:11:17.062 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): calling componentClassLoader.loadClass()
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.DocumentProvider): class com.sun.jersey.core.impl.provider.entity.DocumentProvider
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader)
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling childClassLoader.findClass()
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader)
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling childClassLoader().findClass() 
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling componentClassLoader.findClass()
14:11:17.063 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): calling componentClassLoader.loadClass()
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader)
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling childClassLoader.findClass()
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader)
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling childClassLoader().findClass() 
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling componentClassLoader.findClass()
14:11:17.065 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): calling componentClassLoader.loadClass()
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader)
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling childClassLoader.findClass()
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader)
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling childClassLoader().findClass() 
14:11:17.066 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling componentClassLoader.findClass()
14:11:17.067 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): calling componentClassLoader.loadClass()
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader): class com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App)
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling childClassLoader.findClass()
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App)
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling childClassLoader().findClass() 
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling componentClassLoader.findClass()
14:11:17.068 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): calling componentClassLoader.loadClass()
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$App
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text)
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling childClassLoader.findClass()
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text)
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling childClassLoader().findClass() 
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling componentClassLoader.findClass()
14:11:17.070 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): calling componentClassLoader.loadClass()
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$Text
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General)
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling childClassLoader.findClass()
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General)
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling childClassLoader().findClass() 
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling componentClassLoader.findClass()
14:11:17.071 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): calling componentClassLoader.loadClass()
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General): class com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader)
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling childClassLoader.findClass()
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader)
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling childClassLoader().findClass() 
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling componentClassLoader.findClass()
14:11:17.072 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): calling componentClassLoader.loadClass()
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.EntityHolderReader): class com.sun.jersey.core.impl.provider.entity.EntityHolderReader
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider)
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling childClassLoader.findClass()
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider)
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling childClassLoader().findClass() 
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling componentClassLoader.findClass()
14:11:17.074 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): calling componentClassLoader.loadClass()
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider): class com.sun.jersey.atom.rome.impl.provider.entity.AtomFeedProvider
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider)
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling childClassLoader.findClass()
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider)
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling childClassLoader().findClass() 
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling componentClassLoader.findClass()
14:11:17.075 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): calling componentClassLoader.loadClass()
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider): class com.sun.jersey.atom.rome.impl.provider.entity.AtomEntryProvider
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl)
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling childClassLoader.findClass()
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl)
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling childClassLoader().findClass() 
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling componentClassLoader.findClass()
14:11:17.077 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): calling componentClassLoader.loadClass()
14:11:17.078 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$Wadl
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App)
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling childClassLoader.findClass()
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App)
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling childClassLoader().findClass() 
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling componentClassLoader.findClass()
14:11:17.079 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$App
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General)
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling childClassLoader.findClass()
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General)
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling childClassLoader().findClass() 
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling componentClassLoader.findClass()
14:11:17.080 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONRootElementProvider$General
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App)
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling childClassLoader.findClass()
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App)
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling childClassLoader().findClass() 
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling componentClassLoader.findClass()
14:11:17.081 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$App
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General)
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling childClassLoader.findClass()
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General)
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling childClassLoader().findClass() 
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling componentClassLoader.findClass()
14:11:17.083 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.084 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONJAXBElementProvider$General
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App)
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling childClassLoader.findClass()
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App)
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling childClassLoader().findClass() 
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling componentClassLoader.findClass()
14:11:17.085 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): calling componentClassLoader.loadClass()
14:11:17.086 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$App
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General)
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling childClassLoader.findClass()
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General)
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling childClassLoader().findClass() 
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling componentClassLoader.findClass()
14:11:17.087 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): calling componentClassLoader.loadClass()
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONListElementProvider$General
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App)
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling childClassLoader.findClass()
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App)
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling childClassLoader().findClass() 
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling componentClassLoader.findClass()
14:11:17.088 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): calling componentClassLoader.loadClass()
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$App
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General)
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling childClassLoader.findClass()
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General)
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling childClassLoader().findClass() 
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling componentClassLoader.findClass()
14:11:17.090 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): calling componentClassLoader.loadClass()
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONArrayProvider$General
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App)
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling childClassLoader.findClass()
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App)
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling childClassLoader().findClass() 
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling componentClassLoader.findClass()
14:11:17.092 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): calling componentClassLoader.loadClass()
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App): class com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$App
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General)
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling childClassLoader.findClass()
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General)
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling childClassLoader().findClass() 
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling componentClassLoader.findClass()
14:11:17.094 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): calling componentClassLoader.loadClass()
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General): class com.sun.jersey.json.impl.provider.entity.JSONObjectProvider$General
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy)
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling childClassLoader.findClass()
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy)
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling childClassLoader().findClass() 
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling componentClassLoader.findClass()
14:11:17.095 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): calling componentClassLoader.loadClass()
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy): class com.sun.jersey.json.impl.provider.entity.JacksonProviderProxy
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider)
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling childClassLoader.findClass()
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider)
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling childClassLoader().findClass() 
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling componentClassLoader.findClass()
14:11:17.097 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): calling componentClassLoader.loadClass()
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetRootElementProvider
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider)
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling childClassLoader.findClass()
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider)
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling childClassLoader().findClass() 
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling componentClassLoader.findClass()
14:11:17.098 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): calling componentClassLoader.loadClass()
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetJAXBElementProvider
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider)
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling childClassLoader.findClass()
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider)
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling childClassLoader().findClass() 
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling componentClassLoader.findClass()
14:11:17.100 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): calling componentClassLoader.loadClass()
14:11:17.101 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider): class com.sun.jersey.fastinfoset.impl.provider.entity.FastInfosetListElementProvider
14:11:17.244 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:11:17.244 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling childClassLoader.findResources()
14:11:17.244 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling componentClassLoader.getResources()
14:11:17.245 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): java.lang.CompoundEnumeration@bfd06cd
14:11:17.245 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:11:17.247 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider)
14:11:17.247 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling childClassLoader.findClass()
14:11:17.247 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider)
14:11:17.247 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling childClassLoader().findClass() 
14:11:17.247 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling componentClassLoader.findClass()
14:11:17.248 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): calling componentClassLoader.loadClass()
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider): class com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter)
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling childClassLoader.findClass()
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter)
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling childClassLoader().findClass() 
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling componentClassLoader.findClass()
14:11:17.249 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): calling componentClassLoader.loadClass()
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter): class com.sun.jersey.core.impl.provider.entity.SourceProvider$SourceWriter
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter)
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling childClassLoader.findClass()
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter)
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling childClassLoader().findClass() 
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling componentClassLoader.findClass()
14:11:17.251 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): calling componentClassLoader.loadClass()
14:11:17.252 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.server.impl.template.ViewableMessageBodyWriter): class com.sun.jersey.server.impl.template.ViewableMessageBodyWriter
14:11:17.252 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider)
14:11:17.252 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling childClassLoader.findClass()
14:11:17.252 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider)
14:11:17.252 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling childClassLoader().findClass() 
14:11:17.253 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling componentClassLoader.findClass()
14:11:17.253 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): calling componentClassLoader.loadClass()
14:11:17.254 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider): class com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Map$Entry)
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Map$Entry): calling childClassLoader.findClass()
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Map$Entry): interface java.util.Map$Entry
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource)
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): calling childClassLoader.findClass()
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource)
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource): calling childClassLoader().findClass() 
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource): calling componentClassLoader.findClass()
14:11:17.292 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): calling componentClassLoader.loadClass()
14:11:17.293 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource): class com.sun.jersey.api.client.WebResource
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder)
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): calling childClassLoader.findClass()
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder)
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder): calling childClassLoader().findClass() 
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.sun.jersey.api.client.WebResource$Builder): calling componentClassLoader.findClass()
14:11:17.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): calling componentClassLoader.loadClass()
14:11:17.300 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.sun.jersey.api.client.WebResource$Builder): class com.sun.jersey.api.client.WebResource$Builder
May 04 14:11:17 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (1 etypes {17}) 127.0.0.1: ISSUE: authtime 1777903876, etypes {rep=17 tkt=17 ses=17}, rangerkms/127.25.254.212@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:11:17.592 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- checkAndResetSessionCookie(): status=200, sessionIdCookie=null, newCookie=RANGERADMINSESSIONID=DC4C0D3577ED26C10D74B793A37C9255;Version=1;Path=/
14:11:17.592 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles)
14:11:17.592 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles): calling childClassLoader.findClass()
14:11:17.592 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles)
14:11:17.592 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles): calling childClassLoader().findClass() 
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRoles): class org.apache.ranger.plugin.util.RangerRoles
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRoles): class org.apache.ranger.plugin.util.RangerRoles
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2): calling childClassLoader.findClass()
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2): calling childClassLoader().findClass() 
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2): class org.apache.ranger.plugin.util.JsonUtilsV2
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2): class org.apache.ranger.plugin.util.JsonUtilsV2
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): calling childClassLoader.findClass()
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): calling childClassLoader().findClass() 
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): class org.apache.ranger.plugin.util.JsonUtilsV2$1
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$1): class org.apache.ranger.plugin.util.JsonUtilsV2$1
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): calling childClassLoader.findClass()
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2)
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): calling childClassLoader().findClass() 
14:11:17.593 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): class org.apache.ranger.plugin.util.JsonUtilsV2$2
14:11:17.594 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.JsonUtilsV2$2): class org.apache.ranger.plugin.util.JsonUtilsV2$2
14:11:17.600 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper)
14:11:17.601 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): calling childClassLoader.findClass()
14:11:17.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper)
14:11:17.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper): calling childClassLoader().findClass() 
14:11:17.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.ObjectMapper): calling componentClassLoader.findClass()
14:11:17.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): calling componentClassLoader.loadClass()
14:11:17.604 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.ObjectMapper): class com.fasterxml.jackson.databind.ObjectMapper
14:11:17.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect)
14:11:17.755 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling childClassLoader.findClass()
14:11:17.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect)
14:11:17.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling childClassLoader().findClass() 
14:11:17.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling componentClassLoader.findClass()
14:11:17.756 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): calling componentClassLoader.loadClass()
14:11:17.757 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect): interface com.fasterxml.jackson.annotation.JsonAutoDetect
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility)
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling childClassLoader.findClass()
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility)
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling childClassLoader().findClass() 
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling componentClassLoader.findClass()
14:11:17.758 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): calling componentClassLoader.loadClass()
14:11:17.759 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility): class com.fasterxml.jackson.annotation.JsonAutoDetect$Visibility
14:11:17.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude)
14:11:17.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): calling childClassLoader.findClass()
14:11:17.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude)
14:11:17.761 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude): calling childClassLoader().findClass() 
14:11:17.762 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude): calling componentClassLoader.findClass()
14:11:17.762 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): calling componentClassLoader.loadClass()
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude): interface com.fasterxml.jackson.annotation.JsonInclude
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include)
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling childClassLoader.findClass()
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include)
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling childClassLoader().findClass() 
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling componentClassLoader.findClass()
14:11:17.763 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): calling componentClassLoader.loadClass()
14:11:17.764 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonInclude$Include): class com.fasterxml.jackson.annotation.JsonInclude$Include
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties)
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling childClassLoader.findClass()
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties)
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling childClassLoader().findClass() 
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling componentClassLoader.findClass()
14:11:17.766 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): calling componentClassLoader.loadClass()
14:11:17.767 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnoreProperties): interface com.fasterxml.jackson.annotation.JsonIgnoreProperties
14:11:17.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole)
14:11:17.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole): calling childClassLoader.findClass()
14:11:17.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole)
14:11:17.809 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole): calling childClassLoader().findClass() 
14:11:17.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole): class org.apache.ranger.plugin.model.RangerRole
14:11:17.810 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole): class org.apache.ranger.plugin.model.RangerRole
14:11:17.841 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember)
14:11:17.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): calling childClassLoader.findClass()
14:11:17.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember)
14:11:17.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): calling childClassLoader().findClass() 
14:11:17.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): class org.apache.ranger.plugin.model.RangerRole$RoleMember
14:11:17.842 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerRole$RoleMember): class org.apache.ranger.plugin.model.RangerRole$RoleMember
14:11:17.860 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.getRolesIfUpdated(-1, 0): 
14:11:17.860 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- ==> RangerRolesProvider(serviceName=kms).saveToCache()
14:11:17.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils)
14:11:17.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils): calling childClassLoader.findClass()
14:11:17.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils)
14:11:17.860 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils): calling childClassLoader().findClass() 
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils): class org.apache.ranger.authorization.utils.JsonUtils
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils): class org.apache.ranger.authorization.utils.JsonUtils
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1)
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1): calling childClassLoader.findClass()
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1)
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1): calling childClassLoader().findClass() 
14:11:17.861 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$1): class org.apache.ranger.authorization.utils.JsonUtils$1
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$1): class org.apache.ranger.authorization.utils.JsonUtils$1
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$2)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$2): calling childClassLoader.findClass()
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$2)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$2): calling childClassLoader().findClass() 
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$2): class org.apache.ranger.authorization.utils.JsonUtils$2
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$2): class org.apache.ranger.authorization.utils.JsonUtils$2
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$3)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$3): calling childClassLoader.findClass()
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$3)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$3): calling childClassLoader().findClass() 
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$3): class org.apache.ranger.authorization.utils.JsonUtils$3
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$3): class org.apache.ranger.authorization.utils.JsonUtils$3
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$4)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$4): calling childClassLoader.findClass()
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$4)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$4): calling childClassLoader().findClass() 
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$4): class org.apache.ranger.authorization.utils.JsonUtils$4
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$4): class org.apache.ranger.authorization.utils.JsonUtils$4
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$5)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$5): calling childClassLoader.findClass()
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$5)
14:11:17.862 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$5): calling childClassLoader().findClass() 
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$5): class org.apache.ranger.authorization.utils.JsonUtils$5
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$5): class org.apache.ranger.authorization.utils.JsonUtils$5
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$6)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$6): calling childClassLoader.findClass()
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$6)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$6): calling childClassLoader().findClass() 
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$6): class org.apache.ranger.authorization.utils.JsonUtils$6
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$6): class org.apache.ranger.authorization.utils.JsonUtils$6
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$7)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$7): calling childClassLoader.findClass()
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$7)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$7): calling childClassLoader().findClass() 
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$7): class org.apache.ranger.authorization.utils.JsonUtils$7
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$7): class org.apache.ranger.authorization.utils.JsonUtils$7
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$8)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$8): calling childClassLoader.findClass()
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$8)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$8): calling childClassLoader().findClass() 
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$8): class org.apache.ranger.authorization.utils.JsonUtils$8
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$8): class org.apache.ranger.authorization.utils.JsonUtils$8
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$9)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$9): calling childClassLoader.findClass()
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$9)
14:11:17.863 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$9): calling childClassLoader().findClass() 
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$9): class org.apache.ranger.authorization.utils.JsonUtils$9
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$9): class org.apache.ranger.authorization.utils.JsonUtils$9
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$10)
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$10): calling childClassLoader.findClass()
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$10)
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$10): calling childClassLoader().findClass() 
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$10): class org.apache.ranger.authorization.utils.JsonUtils$10
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$10): class org.apache.ranger.authorization.utils.JsonUtils$10
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$11)
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$11): calling childClassLoader.findClass()
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$11)
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$11): calling childClassLoader().findClass() 
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.utils.JsonUtils$11): class org.apache.ranger.authorization.utils.JsonUtils$11
14:11:17.864 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.utils.JsonUtils$11): class org.apache.ranger.authorization.utils.JsonUtils$11
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule)
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule): calling childClassLoader.findClass()
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule)
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule): calling childClassLoader().findClass() 
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValiditySchedule): class org.apache.ranger.plugin.model.RangerValiditySchedule
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValiditySchedule): class org.apache.ranger.plugin.model.RangerValiditySchedule
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter)
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter): calling childClassLoader.findClass()
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter)
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter): calling childClassLoader().findClass() 
14:11:17.865 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter): class org.apache.ranger.plugin.model.AuditFilter
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter): class org.apache.ranger.plugin.model.AuditFilter
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence)
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): calling childClassLoader.findClass()
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence)
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): calling childClassLoader().findClass() 
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): class org.apache.ranger.plugin.model.RangerValidityRecurrence
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence): class org.apache.ranger.plugin.model.RangerValidityRecurrence
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal)
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal): calling childClassLoader.findClass()
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal)
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal): calling childClassLoader().findClass() 
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPrincipal): class org.apache.ranger.plugin.model.RangerPrincipal
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPrincipal): class org.apache.ranger.plugin.model.RangerPrincipal
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo)
14:11:17.866 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): calling childClassLoader.findClass()
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo)
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): calling childClassLoader().findClass() 
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemDataMaskInfo
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource)
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): calling childClassLoader.findClass()
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource)
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): calling childClassLoader().findClass() 
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource
14:11:17.867 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyResource
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag)
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag): calling childClassLoader.findClass()
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag)
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag): calling childClassLoader().findClass() 
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerTag): class org.apache.ranger.plugin.model.RangerTag
14:11:17.868 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerTag): class org.apache.ranger.plugin.model.RangerTag
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature)
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): calling childClassLoader.findClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature)
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature): calling childClassLoader().findClass() 
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.core.JsonParser$Feature): calling componentClassLoader.findClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): calling componentClassLoader.loadClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.core.JsonParser$Feature): class com.fasterxml.jackson.core.JsonParser$Feature
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature)
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): calling childClassLoader.findClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature)
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature): calling childClassLoader().findClass() 
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.databind.DeserializationFeature): calling componentClassLoader.findClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): calling componentClassLoader.loadClass()
14:11:17.869 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.databind.DeserializationFeature): class com.fasterxml.jackson.databind.DeserializationFeature
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.Math)
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.Math): calling childClassLoader.findClass()
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.Math): class java.lang.Math
14:11:17.894 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.saveToCache(serviceName=kms):33999515:34099008
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider.saveToCache(serviceName=kms)
14:11:17.894 [main] INFO org.apache.ranger.plugin.util.RangerRolesProvider -- RangerRolesProvider(serviceName=kms): found updated version. lastKnownRoleVersion=-1; newVersion=1
14:11:17.894 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.loadUserGroupRolesFromAdmin(serviceName=kms):798271033:1031332193
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms serviceType= kms ).loadUserGroupRolesFromAdmin()
14:11:17.894 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 160860120, Free memory:153712680
14:11:17.894 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerRolesProvider.loadUserGroupRoles(serviceName=kms):798461000:1031516278
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.RangerRolesProvider -- <== RangerRolesProvider(serviceName=kms).loadUserGroupRoles()
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadRoles()
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadPolicy()
14:11:17.894 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 160860120, Free memory:153712680
14:11:17.894 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).loadPolicyfromPolicyAdmin()
14:11:17.895 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- ==> RangerAdminRESTClient.getServicePoliciesIfUpdated(-1, 0)
14:11:17.895 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- Checking Service policy if updated as user : rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)
14:11:17.895 [main] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: rangerkms/127.25.254.212@KRBTEST.COM (auth:KERBEROS)][action: org.apache.ranger.admin.client.RangerAdminRESTClient$$Lambda$162/0x00007f3d08362bc8@62f62db4]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.ranger.audit.provider.MiscUtil.executePrivilegedAction(MiscUtil.java:560)
	at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:137)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:302)
	at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:241)
	at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:139)
	at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:310)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSPlugin.init(RangerKmsAuthorizer.java:346)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:303)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:127)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:153)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
	at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
	at java.base/java.lang.Class.newInstance(Class.java:647)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.init(RangerKmsAuthorizer.java:70)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.<init>(RangerKmsAuthorizer.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.getKeyAcls(KMSWebApp.java:254)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:143)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:11:17.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:11:17.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/jersey-client-components): calling childClassLoader.findResources()
14:11:17.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): calling componentClassLoader.getResources()
14:11:17.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/jersey-client-components): java.lang.CompoundEnumeration@7162669e
14:11:17.896 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/jersey-client-components) 
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling childClassLoader.findResources()
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): calling componentClassLoader.getResources()
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider): java.lang.CompoundEnumeration@46be63cc
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.client.proxy.ViewProxyProvider) 
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling childClassLoader.findResources()
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): calling componentClassLoader.getResources()
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider): java.lang.CompoundEnumeration@1392d14f
14:11:17.897 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/com.sun.jersey.spi.inject.InjectableProvider) 
14:11:17.910 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:11:17.910 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling childClassLoader.findResources()
14:11:17.910 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): calling componentClassLoader.getResources()
14:11:17.910 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyReader): java.lang.CompoundEnumeration@4970a434
14:11:17.910 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyReader) 
14:11:18.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:11:18.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingChildClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling childClassLoader.findResources()
14:11:18.011 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): calling componentClassLoader.getResources()
14:11:18.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResourcesUsingComponentClassLoader(META-INF/services/javax.ws.rs.ext.MessageBodyWriter): java.lang.CompoundEnumeration@4c7bd8b5
14:11:18.012 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findResources(META-INF/services/javax.ws.rs.ext.MessageBodyWriter) 
14:11:18.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies)
14:11:18.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies): calling childClassLoader.findClass()
14:11:18.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies)
14:11:18.172 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies): calling childClassLoader().findClass() 
14:11:18.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies): class org.apache.ranger.plugin.util.ServicePolicies
14:11:18.173 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies): class org.apache.ranger.plugin.util.ServicePolicies
14:11:18.187 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef)
14:11:18.188 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef): calling childClassLoader.findClass()
14:11:18.188 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef)
14:11:18.188 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef): calling childClassLoader().findClass() 
14:11:18.188 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef): class org.apache.ranger.plugin.model.RangerServiceDef
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef): class org.apache.ranger.plugin.model.RangerServiceDef
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies)
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): calling childClassLoader.findClass()
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies)
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): calling childClassLoader().findClass() 
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): class org.apache.ranger.plugin.util.ServicePolicies$TagPolicies
14:11:18.189 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$TagPolicies): class org.apache.ranger.plugin.util.ServicePolicies$TagPolicies
14:11:18.190 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl)
14:11:18.190 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): calling childClassLoader.findClass()
14:11:18.190 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl)
14:11:18.190 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): calling childClassLoader().findClass() 
14:11:18.191 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl
14:11:18.191 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl
14:11:18.192 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy)
14:11:18.192 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy): calling childClassLoader.findClass()
14:11:18.192 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy)
14:11:18.192 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy): calling childClassLoader().findClass() 
14:11:18.193 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy): class org.apache.ranger.plugin.model.RangerPolicy
14:11:18.193 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy): class org.apache.ranger.plugin.model.RangerPolicy
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo)
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): calling childClassLoader.findClass()
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo)
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): calling childClassLoader().findClass() 
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): class org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo
14:11:18.194 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo): class org.apache.ranger.plugin.util.ServicePolicies$SecurityZoneInfo
14:11:18.195 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta)
14:11:18.195 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta): calling childClassLoader.findClass()
14:11:18.195 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta)
14:11:18.195 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta): calling childClassLoader().findClass() 
14:11:18.196 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicyDelta): class org.apache.ranger.plugin.model.RangerPolicyDelta
14:11:18.196 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicyDelta): class org.apache.ranger.plugin.model.RangerPolicyDelta
14:11:18.202 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef)
14:11:18.202 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): calling childClassLoader.findClass()
14:11:18.202 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef)
14:11:18.202 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): calling childClassLoader().findClass() 
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskDef
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef)
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): calling childClassLoader.findClass()
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef)
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): calling childClassLoader().findClass() 
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef
14:11:18.203 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerRowFilterDef
14:11:18.204 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef)
14:11:18.204 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): calling childClassLoader.findClass()
14:11:18.204 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef)
14:11:18.204 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): calling childClassLoader().findClass() 
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerServiceConfigDef
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef)
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): calling childClassLoader.findClass()
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef)
14:11:18.205 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): calling childClassLoader().findClass() 
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerResourceDef
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef)
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): calling childClassLoader.findClass()
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef)
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): calling childClassLoader().findClass() 
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef)
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): calling childClassLoader.findClass()
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef)
14:11:18.206 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): calling childClassLoader().findClass() 
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerPolicyConditionDef
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef)
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): calling childClassLoader.findClass()
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef)
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): calling childClassLoader().findClass() 
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerContextEnricherDef
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef)
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): calling childClassLoader.findClass()
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef)
14:11:18.207 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): calling childClassLoader().findClass() 
14:11:18.213 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef
14:11:18.213 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumDef
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef)
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): calling childClassLoader.findClass()
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef)
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): calling childClassLoader().findClass() 
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef
14:11:18.222 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerDataMaskTypeDef
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory)
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): calling childClassLoader.findClass()
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory)
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): calling childClassLoader().findClass() 
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory
14:11:18.227 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory): class org.apache.ranger.plugin.model.RangerServiceDef$RangerAccessTypeDef$AccessTypeCategory
14:11:18.265 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef)
14:11:18.265 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): calling childClassLoader.findClass()
14:11:18.265 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef)
14:11:18.265 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): calling childClassLoader().findClass() 
14:11:18.266 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef
14:11:18.267 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef): class org.apache.ranger.plugin.model.RangerServiceDef$RangerEnumElementDef
14:11:18.283 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem)
14:11:18.283 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): calling childClassLoader.findClass()
14:11:18.283 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem)
14:11:18.283 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): calling childClassLoader().findClass() 
14:11:18.284 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem
14:11:18.284 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItem
14:11:18.284 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem)
14:11:18.284 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): calling childClassLoader.findClass()
14:11:18.284 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem)
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): calling childClassLoader().findClass() 
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerDataMaskPolicyItem
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem)
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): calling childClassLoader.findClass()
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem)
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): calling childClassLoader().findClass() 
14:11:18.285 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem): class org.apache.ranger.plugin.model.RangerPolicy$RangerRowFilterPolicyItem
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition)
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): calling childClassLoader.findClass()
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition)
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): calling childClassLoader().findClass() 
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition
14:11:18.286 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemCondition
14:11:18.294 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess)
14:11:18.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): calling childClassLoader.findClass()
14:11:18.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess)
14:11:18.298 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): calling childClassLoader().findClass() 
14:11:18.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess
14:11:18.299 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemAccess
14:11:18.310 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo)
14:11:18.314 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): calling childClassLoader.findClass()
14:11:18.314 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo)
14:11:18.314 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): calling childClassLoader().findClass() 
14:11:18.314 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo
14:11:18.314 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo): class org.apache.ranger.plugin.model.RangerPolicy$RangerPolicyItemRowFilterInfo
14:11:18.323 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule)
14:11:18.323 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): calling childClassLoader.findClass()
14:11:18.323 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule)
14:11:18.323 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): calling childClassLoader().findClass() 
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval)
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): calling childClassLoader.findClass()
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval)
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): calling childClassLoader().findClass() 
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): class org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval
14:11:18.324 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval): class org.apache.ranger.plugin.model.RangerValidityRecurrence$ValidityInterval
14:11:18.326 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec)
14:11:18.327 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): calling childClassLoader.findClass()
14:11:18.327 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec)
14:11:18.327 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): calling childClassLoader().findClass() 
14:11:18.327 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec
14:11:18.327 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec): class org.apache.ranger.plugin.model.RangerValidityRecurrence$RecurrenceSchedule$ScheduleFieldSpec
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore)
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): calling childClassLoader.findClass()
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore)
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore): calling childClassLoader().findClass() 
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(com.fasterxml.jackson.annotation.JsonIgnore): calling componentClassLoader.findClass()
14:11:18.336 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): calling componentClassLoader.loadClass()
14:11:18.337 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(com.fasterxml.jackson.annotation.JsonIgnore): interface com.fasterxml.jackson.annotation.JsonIgnore
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator)
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): calling childClassLoader.findClass()
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator)
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): calling childClassLoader().findClass() 
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): class org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator
14:11:18.341 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator): class org.apache.ranger.plugin.model.RangerPolicy$PolicyIdComparator
14:11:18.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.HashSet)
14:11:18.343 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.HashSet): calling childClassLoader.findClass()
14:11:18.344 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.HashSet): class java.util.HashSet
14:11:18.346 [main] DEBUG org.apache.ranger.admin.client.RangerAdminRESTClient -- <== RangerAdminRESTClient.getServicePoliciesIfUpdated(-1, 0): serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} 
displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} 
label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } 
category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null
14:11:18.346 [main] INFO org.apache.ranger.plugin.util.PolicyRefresher -- PolicyRefresher(serviceName=kms): found updated version. lastKnownVersion=-1; newVersion=4
14:11:18.346 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.loadPolicyFromPolicyAdmin(serviceName=kms):232230756:451956821
14:11:18.346 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadPolicyfromPolicyAdmin()
14:11:18.347 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 183931512, Free memory:130641288
14:11:18.347 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- ==> setPolicies(serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null)
14:11:18.347 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils)
14:11:18.347 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils): calling childClassLoader.findClass()
14:11:18.347 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils)
14:11:18.347 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils): calling childClassLoader().findClass() 
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.MapUtils): class org.apache.commons.collections.MapUtils
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapUtils): class org.apache.commons.collections.MapUtils
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.SortedMap)
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.SortedMap): calling childClassLoader.findClass()
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.SortedMap): interface java.util.SortedMap
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap)
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap): calling childClassLoader.findClass()
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap)
14:11:18.348 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap): calling childClassLoader().findClass() 
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap)
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap): calling childClassLoader.findClass()
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap)
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap): calling childClassLoader().findClass() 
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.IterableMap): interface org.apache.commons.collections.IterableMap
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.IterableMap): interface org.apache.commons.collections.IterableMap
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator)
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator): calling childClassLoader.findClass()
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator)
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator): calling childClassLoader().findClass() 
14:11:18.349 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractMapDecorator): class org.apache.commons.collections.map.AbstractMapDecorator
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractMapDecorator): class org.apache.commons.collections.map.AbstractMapDecorator
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableMap): class org.apache.commons.collections.map.UnmodifiableMap
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableMap): class org.apache.commons.collections.map.UnmodifiableMap
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator)
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator): calling childClassLoader.findClass()
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator)
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator): calling childClassLoader().findClass() 
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.MapIterator): interface org.apache.commons.collections.MapIterator
14:11:18.350 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.MapIterator): interface org.apache.commons.collections.MapIterator
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.TreeMap)
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.TreeMap): calling childClassLoader.findClass()
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.TreeMap): class java.util.TreeMap
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap)
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap): calling childClassLoader.findClass()
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap)
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap): calling childClassLoader().findClass() 
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator)
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): calling childClassLoader.findClass()
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator)
14:11:18.351 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): calling childClassLoader().findClass() 
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): class org.apache.commons.collections.map.AbstractSortedMapDecorator
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.AbstractSortedMapDecorator): class org.apache.commons.collections.map.AbstractSortedMapDecorator
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.commons.collections.map.UnmodifiableSortedMap): class org.apache.commons.collections.map.UnmodifiableSortedMap
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.commons.collections.map.UnmodifiableSortedMap): class org.apache.commons.collections.map.UnmodifiableSortedMap
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil)
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): calling childClassLoader.findClass()
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil)
14:11:18.352 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): calling childClassLoader().findClass() 
14:11:18.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): class org.apache.ranger.plugin.util.RangerPolicyDeltaUtil
14:11:18.353 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerPolicyDeltaUtil): class org.apache.ranger.plugin.util.RangerPolicyDeltaUtil
14:11:18.353 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- ==> hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null]
14:11:18.353 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- <== hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null], ret:[false]
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- Creating engine from policies
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor)
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): calling childClassLoader.findClass()
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor)
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): calling childClassLoader().findClass() 
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor
14:11:18.354 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor): interface org.apache.ranger.plugin.policyengine.RangerAccessRequestProcessor
14:11:18.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine)
14:11:18.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine): calling childClassLoader.findClass()
14:11:18.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine)
14:11:18.355 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine): calling childClassLoader().findClass() 
14:11:18.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.PolicyEngine): class org.apache.ranger.plugin.policyengine.PolicyEngine
14:11:18.356 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.PolicyEngine): class org.apache.ranger.plugin.policyengine.PolicyEngine
14:11:18.357 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- ==> PolicyEngine(, serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null, org.apache.ranger.plugin.policyengine.RangerPluginContext@1935fb3e)
14:11:18.357 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 184977368, Free memory:129595432
14:11:18.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil)
14:11:18.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil): calling childClassLoader.findClass()
14:11:18.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil)
14:11:18.357 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil): calling childClassLoader().findClass() 
14:11:18.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.ServiceDefUtil): class org.apache.ranger.plugin.util.ServiceDefUtil
14:11:18.358 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.ServiceDefUtil): class org.apache.ranger.plugin.util.ServiceDefUtil
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): calling childClassLoader.findClass()
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): calling childClassLoader().findClass() 
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): calling childClassLoader.findClass()
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): calling childClassLoader().findClass() 
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): calling childClassLoader.findClass()
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator)
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): calling childClassLoader().findClass() 
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): interface org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator
14:11:18.359 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator): interface org.apache.ranger.plugin.conditionevaluator.RangerConditionEvaluator
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerAbstractConditionEvaluator
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator): class org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher)
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): calling childClassLoader.findClass()
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher)
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): calling childClassLoader().findClass() 
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher)
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): calling childClassLoader.findClass()
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher)
14:11:18.360 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): calling childClassLoader().findClass() 
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher)
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): calling childClassLoader.findClass()
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher)
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): calling childClassLoader().findClass() 
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): interface org.apache.ranger.plugin.contextenricher.RangerContextEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerContextEnricher): interface org.apache.ranger.plugin.contextenricher.RangerContextEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): class org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher): class org.apache.ranger.plugin.contextenricher.RangerAbstractContextEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): class org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher): class org.apache.ranger.plugin.contextenricher.RangerUserStoreEnricher
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> cleanResourceMatchers()
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock)
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock): calling childClassLoader.findClass()
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock): class java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== cleanResourceMatchers()
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock)
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock): calling childClassLoader.findClass()
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock)
14:11:18.361 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock): calling childClassLoader().findClass() 
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock): class org.apache.ranger.plugin.util.RangerReadWriteLock
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock): class org.apache.ranger.plugin.util.RangerReadWriteLock
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock): calling childClassLoader.findClass()
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.Lock): interface java.util.concurrent.locks.Lock
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): calling childClassLoader.findClass()
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): calling childClassLoader().findClass() 
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.AutoCloseable)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.AutoCloseable): calling childClassLoader.findClass()
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.AutoCloseable): interface java.lang.AutoCloseable
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): class org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock): class org.apache.ranger.plugin.util.RangerReadWriteLock$RangerLock
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): calling childClassLoader.findClass()
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher)
14:11:18.362 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): calling childClassLoader().findClass() 
14:11:18.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): class org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher
14:11:18.363 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher): class org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher
14:11:18.363 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.buildZoneTrie()
14:11:18.363 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.buildZoneTrie()
14:11:18.363 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- ==> hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null]
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.util.RangerPolicyDeltaUtil -- <== hasPolicyDeltas(servicePolicies:[serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} 
implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} 
description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} 
}RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null], ret:[false]
14:11:18.364 [main] INFO org.apache.ranger.plugin.policyengine.PolicyEngine -- Policy engine will not perform in place update while processing policies.
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext)
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext): calling childClassLoader.findClass()
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext)
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext): calling childClassLoader().findClass() 
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerAuthContext): class org.apache.ranger.plugin.service.RangerAuthContext
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerAuthContext): class org.apache.ranger.plugin.service.RangerAuthContext
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil)
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil): calling childClassLoader.findClass()
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil)
14:11:18.364 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil): calling childClassLoader().findClass() 
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRolesUtil): class org.apache.ranger.plugin.util.RangerRolesUtil
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRolesUtil): class org.apache.ranger.plugin.util.RangerRolesUtil
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil)
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): calling childClassLoader.findClass()
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil)
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): calling childClassLoader().findClass() 
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): class org.apache.ranger.plugin.util.RangerUserStoreUtil
14:11:18.365 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerUserStoreUtil): class org.apache.ranger.plugin.util.RangerUserStoreUtil
14:11:18.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository)
14:11:18.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): calling childClassLoader.findClass()
14:11:18.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository)
14:11:18.366 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): calling childClassLoader().findClass() 
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy)
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): calling childClassLoader.findClass()
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy)
14:11:18.367 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): calling childClassLoader().findClass() 
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicy
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator)
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator)
14:11:18.368 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator)
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator)
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator)
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator)
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator)
14:11:18.369 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator)
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator)
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator)
14:11:18.370 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.371 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator
14:11:18.371 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCachedPolicyEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator)
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): calling childClassLoader.findClass()
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator)
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): calling childClassLoader().findClass() 
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): interface org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator): interface org.apache.ranger.plugin.policyresourcematcher.RangerResourceEvaluator
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum)
14:11:18.372 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): calling childClassLoader.findClass()
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum)
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): calling childClassLoader().findClass() 
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum): class org.apache.ranger.plugin.policyengine.RangerPolicyRepository$AuditModeEnum
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- RangerPolicyRepository : building policy-repository for service[kms], and zone:[null] with auditMode[AUDIT_DEFAULT]
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper)
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): calling childClassLoader.findClass()
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper)
14:11:18.373 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): calling childClassLoader().findClass() 
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> RangerServiceDefHelper(). The RangerServiceDef: RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} 
resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean 
expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate)
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): calling childClassLoader.findClass()
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate)
14:11:18.374 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): calling childClassLoader().findClass() 
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph)
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): calling childClassLoader.findClass()
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph)
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): calling childClassLoader().findClass() 
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph
14:11:18.375 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$DirectedGraph
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Objects)
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Objects): calling childClassLoader.findClass()
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Objects): class java.util.Objects
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: _nodes={keyname=[]}
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists)
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling childClassLoader.findClass()
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists)
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling childClassLoader().findClass() 
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling componentClassLoader.findClass()
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): calling componentClassLoader.loadClass()
14:11:18.376 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Lists): class org.apache.hadoop.thirdparty.com.google.common.collect.Lists
14:11:18.380 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:11:18.380 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.function.Function)
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.function.Function): calling childClassLoader.findClass()
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.function.Function): interface java.util.function.Function
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.stream.Stream)
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.stream.Stream): calling childClassLoader.findClass()
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.stream.Stream): interface java.util.stream.Stream
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.stream.Collectors)
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.stream.Collectors): calling childClassLoader.findClass()
14:11:18.381 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.stream.Collectors): class java.util.stream.Collectors
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel)
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): calling childClassLoader.findClass()
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel)
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): calling childClassLoader().findClass() 
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel): class org.apache.ranger.plugin.model.validation.RangerServiceDefHelper$Delegate$ResourceNameLevel
14:11:18.382 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Found [3] resource hierarchies for service [kms] update-date[Mon May 04 14:10:43 UTC 2026]: {0=[[RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]], 1=[], 2=[]}
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.buildPolicyEvaluator(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }},RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, PolicyEngineOptions: { evaluatorType: auto, evaluateDelegateAdminOnly: false, disableContextEnrichers: false, disableCustomConditions: false, disableTagPolicyEvaluation: false, disablePolicyRefresher: false, disableTagRetriever: false, disableUserStoreRetriever: false, enableTagEnricherWithLocalRefresher: false, enableUserStoreEnricherWithLocalRefresher: false, disableTrieLookupPrefilter: false, optimizeTrieForRetrieval: false, cacheAuditResult: false, disableRoleResolution: true, optimizeTrieForSpace: false, optimizeTagTrieForRetrieval: false, optimizeTagTrieForSpace: false, enableResourceMatcherReuse: true })
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicy(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicyItems(3): 
14:11:18.383 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.scrubPolicy(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}): false
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator)
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator)
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator)
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator)
14:11:18.384 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerDataMaskPolicyItemEvaluator
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator)
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator)
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerRowFilterPolicyItemEvaluator
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator)
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): calling childClassLoader.findClass()
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator)
14:11:18.385 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): calling childClassLoader().findClass() 
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyEvalOrderComparator
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator)
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): calling childClassLoader.findClass()
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator)
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): calling childClassLoader().findClass() 
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$PolicyNameComparator
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1)
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): calling childClassLoader.findClass()
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1)
14:11:18.386 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): calling childClassLoader().findClass() 
14:11:18.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1
14:11:18.387 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$1
14:11:18.388 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.init()
14:11:18.388 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:11:18.388 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.388 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator)
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): calling childClassLoader.findClass()
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator)
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): calling childClassLoader().findClass() 
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator)
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): calling childClassLoader.findClass()
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator)
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): calling childClassLoader().findClass() 
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator): interface org.apache.ranger.plugin.policyevaluator.RangerPolicyEvaluator$RangerPolicyResourceEvaluator
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher)
14:11:18.389 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): calling childClassLoader.findClass()
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher)
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): calling childClassLoader().findClass() 
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): interface org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher): interface org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher)
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): calling childClassLoader.findClass()
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher)
14:11:18.390 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): calling childClassLoader().findClass() 
14:11:18.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): class org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher
14:11:18.391 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher): class org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher)
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): calling childClassLoader.findClass()
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher)
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): calling childClassLoader().findClass() 
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): interface org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher): interface org.apache.ranger.plugin.resourcematcher.RangerResourceMatcher
14:11:18.392 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> getResourceHierarchies(policyType=0, keys=keyname)
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== getResourceHierarchies(policyType=0, keys=keyname) : [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.createResourceMatcher(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} })
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> getResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} })
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock)
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock): calling childClassLoader.findClass()
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock): class java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== getResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }) : ret=null
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher)
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): calling childClassLoader.findClass()
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher)
14:11:18.395 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): calling childClassLoader().findClass() 
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher)
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): calling childClassLoader.findClass()
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher)
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): calling childClassLoader().findClass() 
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher
14:11:18.396 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher): class org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher)
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): calling childClassLoader.findClass()
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher)
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): calling childClassLoader().findClass() 
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher)
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): calling childClassLoader.findClass()
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher)
14:11:18.397 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher)
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): calling childClassLoader.findClass()
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher)
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): calling childClassLoader().findClass() 
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): class org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher): class org.apache.ranger.plugin.resourcematcher.AbstractStringResourceMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveWildcardMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher)
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): calling childClassLoader.findClass()
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher)
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveWildcardMatcher
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher)
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): calling childClassLoader.findClass()
14:11:18.398 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher)
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): calling childClassLoader().findClass() 
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveWildcardMatcher
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher)
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): calling childClassLoader.findClass()
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher)
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): calling childClassLoader().findClass() 
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStringMatcher
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher)
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): calling childClassLoader.findClass()
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher)
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): calling childClassLoader().findClass() 
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher
14:11:18.399 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStringMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): calling childClassLoader.findClass()
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): calling childClassLoader().findClass() 
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStringMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveEndsWithMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveEndsWithMatcher
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): calling childClassLoader.findClass()
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher)
14:11:18.400 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): calling childClassLoader().findClass() 
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveEndsWithMatcher
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher)
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher)
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.QuotedCaseSensitiveStartsWithMatcher
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher)
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher)
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:11:18.401 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseInsensitiveStartsWithMatcher
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher)
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): calling childClassLoader.findClass()
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher)
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): calling childClassLoader().findClass() 
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher): class org.apache.ranger.plugin.resourcematcher.CaseSensitiveStartsWithMatcher
14:11:18.402 [main] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- ==> RangerAbstractResourceMatcher.init()
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver)
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): calling childClassLoader.findClass()
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver)
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): calling childClassLoader().findClass() 
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): class org.apache.ranger.plugin.util.RangerRequestExprResolver
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerRequestExprResolver): class org.apache.ranger.plugin.util.RangerRequestExprResolver
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.regex.Matcher)
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.regex.Matcher): calling childClassLoader.findClass()
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.regex.Matcher): class java.util.regex.Matcher
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.resourcematcher.ResourceMatcher -- ==> setDelimiters(value= , startDelimiter={, endDelimiter=}, escapeChar=\, prefix=
14:11:18.403 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer)
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer): calling childClassLoader.findClass()
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer)
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer): calling childClassLoader().findClass() 
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.StringTokenReplacer): class org.apache.ranger.plugin.util.StringTokenReplacer
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.StringTokenReplacer): class org.apache.ranger.plugin.util.StringTokenReplacer
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.resourcematcher.ResourceMatcher -- <== setDelimiters(value= , startDelimiter={, endDelimiter=}, escapeChar=\, prefix=
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator)
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): calling childClassLoader.findClass()
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator)
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): calling childClassLoader().findClass() 
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$PriorityComparator
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- <== RangerAbstractResourceMatcher.init()
14:11:18.404 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- ==> setResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }, matcher=RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }})
14:11:18.405 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPluginContext -- <== setResourceMatcher(resourceDefName=keyname, resource=RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }, matcher=RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }})
14:11:18.405 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.createResourceMatcher(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }): RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }}
14:11:18.405 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():11786519:13134543
14:11:18.405 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator)
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator)
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.406 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator)
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator)
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyItemEvaluator
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator
14:11:18.407 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator(policyId=3, policyItem=RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, serviceType=kms, conditionsDisabled=false)
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator)
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): calling childClassLoader.findClass()
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator)
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): calling childClassLoader().findClass() 
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder)
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): calling childClassLoader.findClass()
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder)
14:11:18.408 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): calling childClassLoader().findClass() 
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder): class org.apache.ranger.plugin.policyevaluator.RangerCustomConditionEvaluator$SingletonHolder
14:11:18.409 [main] DEBUG org.apache.ranger.perf.policyitem.init -- [PERF]:main:RangerPolicyItemEvaluator.getPolicyItemConditionEvaluators(policyId=3, policyItemIndex=1):30647:31011
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator(policyId=3, conditionsCount=0)
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator(policyId=3, policyItem=RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, serviceType=kms, conditionsDisabled=false)
14:11:18.409 [main] DEBUG org.apache.ranger.perf.policyitem.init -- [PERF]:main:RangerPolicyItemEvaluator.getPolicyItemConditionEvaluators(policyId=3, policyItemIndex=2):13817:15628
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator(policyId=3, conditionsCount=0)
14:11:18.409 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=3):17289:18382
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator)
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): calling childClassLoader.findClass()
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator)
14:11:18.409 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): calling childClassLoader().findClass() 
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator): class org.apache.ranger.plugin.policyevaluator.RangerPolicyItemEvaluator$EvalOrderComparator
14:11:18.410 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=3, policyName=all):19701965:21820286
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.checkIfHasAllPerms()
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.checkIfHasAllPerms(), false
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- ==> RangerOptimizedPolicyEvaluator.computeEvalOrder()
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames)
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): calling childClassLoader.findClass()
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames)
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): calling childClassLoader().findClass() 
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames
14:11:18.410 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames): class org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator$LevelResourceNames
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- <== RangerOptimizedPolicyEvaluator.computeEvalOrder(), policyName:all, priority:9930
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerOptimizedPolicyEvaluator -- <== RangerOptimizedPolicyEvaluator.init()
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.buildPolicyEvaluator(RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }},RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} 
options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} 
isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } 
category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }): RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} 
allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={{RangerDefaultResourceMatcher={RangerAbstractResourceMatcher={resourceDef={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} policyResource={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} optIgnoreCase={false} optQuotedCaseSensitive={false} optQuoteChars={"} optWildCard={true} policyValues={*,} policyIsExcludes={false} isMatchAny={true} options={wildCard=true;ignoreCase=false;} }}} } }} }
14:11:18.411 [main] INFO org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- This policy engine contains 1 policy evaluators
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- policy evaluation order: 1 policies
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- policy evaluation order: #1 - policy id=3; name=all; evalOrder=9930
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- dataMask policy evaluation order: 0 policies
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- rowFilter policy evaluation order: 0 policies
14:11:18.411 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- audit policy evaluation order: 0 policies
14:11:18.412 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult)
14:11:18.412 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): calling childClassLoader.findClass()
14:11:18.412 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult)
14:11:18.412 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): calling childClassLoader().findClass() 
14:11:18.413 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): class org.apache.ranger.plugin.model.AuditFilter$AccessResult
14:11:18.413 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.model.AuditFilter$AccessResult): class org.apache.ranger.plugin.model.AuditFilter$AccessResult
14:11:18.417 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator)
14:11:18.417 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): calling childClassLoader.findClass()
14:11:18.417 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator)
14:11:18.417 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): calling childClassLoader().findClass() 
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem)
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): calling childClassLoader.findClass()
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem)
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): calling childClassLoader().findClass() 
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItem
14:11:18.418 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyEvaluator(auditFilter={accessResult=DENIED, resources=null, accessTypes=null, actions=null, users=null, groups=null, roles=null, isAudited=true}, priority=2, matchAnyResource=true)
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.init(2)
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:11:18.419 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():28595:30194
14:11:18.419 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.420 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=2):19230:20550
14:11:18.420 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=2, policyName=null):1193668:1341720
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator)
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): calling childClassLoader.findClass()
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator)
14:11:18.420 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): calling childClassLoader().findClass() 
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$RangerAuditPolicyItemEvaluator
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator(RangerAuditPolicyItem={RangerPolicyItem={accessTypes={} users={} groups={} roles={} conditions={} delegateAdmin={false} } accessResult={DENIED} actions={} accessTypes={} isAudited={true}}, matchAnyUser=true, matchAnyAction=true, hasResourceOwner=false)
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.init(2)
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyEvaluator(auditFilter={accessResult=null, resources=null, accessTypes=null, actions=null, users=[keyadmin], groups=null, roles=null, isAudited=false}, priority=1, matchAnyResource=true)
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.init(1)
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.init()
14:11:18.421 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.init(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} 
description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} 
rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- ==> RangerAbstractPolicyEvaluator.getPrunedPolicy(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }})
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.getPrunedPolicy(isPruningNeeded=false) : RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.init()
14:11:18.422 [main] DEBUG org.apache.ranger.perf.policyresourcematcher.init -- [PERF]:main:RangerDefaultPolicyResourceMatcher.init():21162:29321
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.init(): ret=true
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator -- <== RangerAbstractPolicyEvaluator.init(RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}, RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} })
14:11:18.422 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.getPolicyConditionEvaluators(policyId=1):14646:16038
14:11:18.422 [main] DEBUG org.apache.ranger.perf.policy.init -- [PERF]:main:RangerPolicyEvaluator.init(policyId=1, policyName=null):715251:752584
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.init()
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator(RangerAuditPolicyItem={RangerPolicyItem={accessTypes={} users={keyadmin } groups={} roles={} conditions={} delegateAdmin={false} } accessResult={null} actions={} accessTypes={} isAudited={false}}, matchAnyUser=false, matchAnyAction=true, hasResourceOwner=false)
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.init(1)
14:11:18.422 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie)
14:11:18.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): calling childClassLoader.findClass()
14:11:18.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie)
14:11:18.423 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): calling childClassLoader().findClass() 
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): class org.apache.ranger.plugin.policyengine.RangerResourceTrie
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie): class org.apache.ranger.plugin.policyengine.RangerResourceTrie
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler)
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): calling childClassLoader.findClass()
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler)
14:11:18.424 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): calling childClassLoader().findClass() 
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): interface org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler): interface org.apache.ranger.plugin.policyengine.RangerResourceTrie$TraverseMatchHandler
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=1, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=1, isMultiThreaded=false)
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode)
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): calling childClassLoader.findClass()
14:11:18.425 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode)
14:11:18.426 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): calling childClassLoader().findClass() 
14:11:18.426 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode
14:11:18.426 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieNode
14:11:18.427 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):1276769:1319055
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=1, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ 1|,|]
14:11:18.427 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):1669679:1709535
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData)
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): calling childClassLoader.findClass()
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData)
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): calling childClassLoader().findClass() 
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData
14:11:18.427 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$TrieData
14:11:18.427 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=1, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false)
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):14042:16743
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):160757:157449
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false)
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):20838:22885
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=0, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):182058:234706
14:11:18.428 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=0, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie(keyname, evaluatorCount=2, isOptimizedForRetrieval=false, isOptimizedForSpace=false)
14:11:18.428 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> buildTrie(keyname, evaluatorCount=2, isMultiThreaded=false)
14:11:18.429 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(resourceDef=keyname):55520:57885
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== buildTrie(keyname, evaluatorCount=2, isMultiThreaded=false) :nodeValue=ROOT; isSetup=false; isSharingParentWildcardEvaluators=false; childCount=0; evaluators=[]; wildcardEvaluators=[ ]
14:11:18.429 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- [PERF]:main:RangerResourceTrie.init(name=keyname):198904:268823
14:11:18.429 [main] DEBUG org.apache.ranger.perf.resourcetrie.init -- resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie(keyname, evaluatorCount=2, isOptimizedForRetrieval=false, isOptimizedForSpace=false): resourceName=keyname; optIgnoreCase=false; optWildcard=true; wildcardChars=*?{}\$; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=0; evaluatorListRefCount=0; wildcardEvaluatorListRefCount=0
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> RangerServiceDefHelper(). The RangerServiceDef: RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: _nodes={keyname=[]}
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sources: [keyname]
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Returning sinks: [keyname]
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:11:18.429 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Created graph for resources: null
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- Found [3] resource hierarchies for service [kms] update-date[Mon May 04 14:10:43 UTC 2026]: {0=[[RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]], 1=[], 2=[]}
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- PolicyEngine : No tag-policy-repository for service kms
14:11:18.430 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:RangerPolicyEngine.init(hashCode=2404aeee):68151769:72842130
14:11:18.430 [main] DEBUG org.apache.ranger.perf.policyengine.init -- In-Use memory: 190220248, Free memory:124352552
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- <== PolicyEngine()
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig)
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): calling childClassLoader.findClass()
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig)
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): calling childClassLoader().findClass() 
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$ServiceConfig
14:11:18.430 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor)
14:11:18.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): calling childClassLoader.findClass()
14:11:18.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor)
14:11:18.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): calling childClassLoader().findClass() 
14:11:18.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): class org.apache.ranger.plugin.service.RangerDefaultRequestProcessor
14:11:18.431 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.service.RangerDefaultRequestProcessor): class org.apache.ranger.plugin.service.RangerDefaultRequestProcessor
14:11:18.432 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Switching policy engine from [-1]
14:11:18.432 [main] INFO org.apache.ranger.plugin.service.RangerBasePlugin -- Switched policy engine to [4]
14:11:18.432 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).saveToCache()
14:11:18.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.io.FileFilter)
14:11:18.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.io.FileFilter): calling childClassLoader.findClass()
14:11:18.466 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.io.FileFilter): interface java.io.FileFilter
14:11:18.467 [main] INFO org.apache.ranger.plugin.util.PolicyRefresher -- No files matching '.+json_*' found
14:11:18.467 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.saveToCache(serviceName=kms):34237849:34723317
14:11:18.467 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).saveToCache()
14:11:18.467 [main] DEBUG org.apache.ranger.plugin.service.RangerBasePlugin -- <== setPolicies(serviceName=kms, serviceId=3, policyVersion=4, policyUpdateTime=Mon May 04 14:11:01 UTC 2026, policies=[RangerPolicy={id={3} guid={1d0269a7-54a8-4091-8f87-b7e039293eab} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={1} service={kms} name={all} policyType={0} policyPriority={0} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={kms} resources={keyname={RangerPolicyResource={values={* } isExcludes={false} isRecursive={false} }} } additionalResources={} policyLabels={} policyConditions={} policyItems={RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}], tagPolicies=null, policyDeltas=null, serviceDef=RangerServiceDef={id={7} guid={025a584e-60a1-4f68-b6e0-9926060801e1} isEnabled={true} createdBy={null} updatedBy={null} createTime={Mon May 04 14:10:43 UTC 2026} updateTime={Mon May 04 14:10:43 UTC 2026} version={1} name={kms} displayName={kms} implClass={org.apache.ranger.services.kms.RangerServiceKMS} label={KMS} description={KMS} rbKeyLabel={null} rbKeyDescription={null} options={enableDenyAndExceptionsInPolicies=true enableTagBasedPolicies=true ui.pages=encryption security.allowed.roles=keyadmin } configs={RangerServiceConfigDef={itemId={provider} name={provider} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={KMS URL} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={username} name={username} type={string} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Username} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={password} name={password} type={password} subType={null} mandatory={true} defaultValue={null} validationRegEx={null} validationMessage={null} uiHint={null} label={Password} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }RangerServiceConfigDef={itemId={ranger.plugin.audit.filters} name={ranger.plugin.audit.filters} type={string} subType={null} mandatory={false} defaultValue={[ {'accessResult': 'DENIED', 'isAudited': true}, {'users':['keyadmin'] ,'isAudited':false} ]} validationRegEx={null} validationMessage={null} uiHint={null} label={Ranger Default Audit Filters} description={null} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} resources={RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }} accessTypes={RangerAccessTypeDef={itemId={1} name={create} label={Create} rbKeyLabel={null} impliedGrants={} category={CREATE} }RangerAccessTypeDef={itemId={2} name={delete} label={Delete} rbKeyLabel={null} impliedGrants={} category={DELETE} }RangerAccessTypeDef={itemId={3} name={rollover} label={Rollover} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={4} name={setkeymaterial} label={Set Key Material} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={5} name={get} label={Get} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={6} name={getkeys} label={Get Keys} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={7} name={getmetadata} label={Get Metadata} rbKeyLabel={null} impliedGrants={} category={READ} }RangerAccessTypeDef={itemId={8} name={generateeek} label={Generate EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }RangerAccessTypeDef={itemId={9} name={decrypteek} label={Decrypt EEK} rbKeyLabel={null} impliedGrants={} category={UPDATE} }} policyConditions={RangerPolicyConditionDef={itemId={1} name={_expression} evaluator={org.apache.ranger.plugin.conditionevaluator.RangerScriptConditionEvaluator} evaluatorOptions={{engineName=JavaScript, ui.isMultiline=true}} validationRegEx={null} validationMessage={null} uiHint={{ "isMultiline":true }} label={Enter boolean expression} description={Boolean expression} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} }} contextEnrichers={} enums={} dataMaskDef={RangerDataMaskDef={maskTypes={} accessTypes={} resources={} }} rowFilterDef={RangerRowFilterDef={accessTypes={} resources={} }} markerAccessTypes={RangerAccessTypeDef={itemId={10} name={_CREATE} label={_CREATE} rbKeyLabel={null} impliedGrants={create } category={null} }RangerAccessTypeDef={itemId={11} name={_READ} label={_READ} rbKeyLabel={null} impliedGrants={get getmetadata getkeys } category={null} }RangerAccessTypeDef={itemId={12} name={_UPDATE} label={_UPDATE} rbKeyLabel={null} impliedGrants={rollover setkeymaterial decrypteek generateeek } category={null} }RangerAccessTypeDef={itemId={13} name={_DELETE} label={_DELETE} rbKeyLabel={null} impliedGrants={delete } category={null} }RangerAccessTypeDef={itemId={14} name={_MANAGE} label={_MANAGE} rbKeyLabel={null} impliedGrants={} category={null} }RangerAccessTypeDef={itemId={15} name={_ALL} label={_ALL} rbKeyLabel={null} impliedGrants={get getmetadata create rollover setkeymaterial getkeys decrypteek generateeek delete } category={null} }} }, auditMode=audit-default, securityZones=null)
14:11:18.467 [main] DEBUG org.apache.ranger.perf.policyengine.init -- [PERF]:main:PolicyRefresher.loadPolicy(serviceName=kms):346509140:572732755
14:11:18.467 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- <== PolicyRefresher(serviceName=kms).loadPolicy()
14:11:18.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.Timer)
14:11:18.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.Timer): calling childClassLoader.findClass()
14:11:18.468 [PolicyRefresher(serviceName=kms)-24] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- ==> PolicyRefresher(serviceName=kms).run()
14:11:18.468 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.Timer): class java.util.Timer
14:11:18.469 [main] DEBUG org.apache.ranger.plugin.util.PolicyRefresher -- Scheduled policyDownloadRefresher to download policies every 30000 milliseconds
14:11:18.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler)
14:11:18.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): calling childClassLoader.findClass()
14:11:18.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler)
14:11:18.469 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): calling childClassLoader().findClass() 
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): class org.apache.ranger.plugin.audit.RangerDefaultAuditHandler
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.audit.RangerDefaultAuditHandler): class org.apache.ranger.plugin.audit.RangerDefaultAuditHandler
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase)
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase): calling childClassLoader.findClass()
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase)
14:11:18.470 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase): calling childClassLoader().findClass() 
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuditEventBase): class org.apache.ranger.audit.model.AuditEventBase
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuditEventBase): class org.apache.ranger.audit.model.AuditEventBase
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent)
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent): calling childClassLoader.findClass()
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent)
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent): calling childClassLoader().findClass() 
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.audit.model.AuthzAuditEvent): class org.apache.ranger.audit.model.AuthzAuditEvent
14:11:18.471 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.audit.model.AuthzAuditEvent): class org.apache.ranger.audit.model.AuthzAuditEvent
14:11:18.472 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.init()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:18.472 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.init()
14:11:18.472 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.RangerKmsAuthorizer()
14:11:18.472 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.startReloader()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.Executors)
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.Executors): calling childClassLoader.findClass()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.Executors): class java.util.concurrent.Executors
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit)
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit): calling childClassLoader.findClass()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.TimeUnit): class java.util.concurrent.TimeUnit
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService)
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService): calling childClassLoader.findClass()
14:11:18.472 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.util.concurrent.ScheduledExecutorService): interface java.util.concurrent.ScheduledExecutorService
14:11:18.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:18.473 [main] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:18.473 [main] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.startReloader()
14:11:18.499 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSAudit -- No audit logger configured, using default.
14:11:18.500 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSAudit -- Initializing audit logger class org.apache.hadoop.crypto.key.kms.server.SimpleKMSAuditLogger
14:11:18.502 [main] INFO org.apache.ranger.kms.metrics.KMSMetricWrapper -- Creating KMSMetricWrapper with thread-safe value=false
14:11:18.505 [main] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- ===>> KMSMetricWrapper.init()
14:11:18.507 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- from system property: null
14:11:18.508 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- from environment variable: null
14:11:18.582 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Could not locate file hadoop-metrics2-kms.properties
org.apache.commons.configuration2.ex.ConfigurationException: Could not locate: FileLocator [basePath=null, encoding=null, fileName=hadoop-metrics2-kms.properties, fileSystem=null, locationStrategy=null, sourceURL=null, urlConnectionOptions=null]
	at org.apache.commons.configuration2.io.FileLocatorUtils.locateOrThrow(FileLocatorUtils.java:484)
	at org.apache.commons.configuration2.io.FileHandler.load(FileHandler.java:606)
	at org.apache.commons.configuration2.io.FileHandler.load(FileHandler.java:579)
	at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:118)
	at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:97)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:482)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
	at org.apache.ranger.metrics.RangerMetricsSystemWrapper.init(RangerMetricsSystemWrapper.java:56)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.init(KMSMetricWrapper.java:80)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.<init>(KMSMetricWrapper.java:53)
	at org.apache.ranger.kms.metrics.KMSMetricWrapper.getInstance(KMSMetricWrapper.java:61)
	at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:168)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4018)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4460)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
14:11:18.583 [main] DEBUG org.apache.commons.configuration2.io.FileLocatorUtils -- Loading configuration from the context classpath (hadoop-metrics2.properties)
14:11:18.612 [main] INFO org.apache.hadoop.metrics2.impl.MetricsConfig -- Loaded properties from hadoop-metrics2.properties
14:11:18.614 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Properties: *.period = 30

14:11:18.614 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- Metrics Config: 
14:11:18.616 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: period
14:11:18.617 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: periodMillis
14:11:18.620 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.metrics2.impl.MetricsSystemImpl.droppedPubAll with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Dropped updates by all sinks"})
14:11:18.621 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableStat org.apache.hadoop.metrics2.impl.MetricsSystemImpl.publishStat with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Publish", "Publishing stats"})
14:11:18.621 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field org.apache.hadoop.metrics2.lib.MutableStat org.apache.hadoop.metrics2.impl.MetricsSystemImpl.snapshotStat with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Snapshot", "Snapshot stats"})
14:11:18.625 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:18.625 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:18.625 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:18.632 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:18.633 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=10
14:11:18.633 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:18.633 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of active metrics sources, name=NumActiveSources, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of all registered metrics sources, name=NumAllSources, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of active metrics sinks, name=NumActiveSinks, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of all registered metrics sinks, name=NumAllSinks, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Dropped updates by all sinks, name=DroppedPubAll, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for publishing stats, name=PublishNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for publishing stats, name=PublishAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for snapshot stats, name=SnapshotNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for snapshot stats, name=SnapshotAvgTime, type=java.lang.Double, read-only, descriptor={}]]
14:11:18.644 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:18.644 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=MetricsSystem,sub=Stats
14:11:18.644 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source MetricsSystem,sub=Stats registered.
14:11:18.645 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Scheduled Metric snapshot period at 30 second(s).
14:11:18.645 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- kms metrics system started
14:11:18.645 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:18.645 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:18.645 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:18.646 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:18.646 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=10
14:11:18.646 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:18.647 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for getGroups, name=GetGroupsNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for getGroups, name=GetGroupsAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of failed kerberos logins and latency (milliseconds), name=LoginFailureNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of failed kerberos logins and latency (milliseconds), name=LoginFailureAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of successful kerberos logins and latency (milliseconds), name=LoginSuccessNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of successful kerberos logins and latency (milliseconds), name=LoginSuccessAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Renewal failures since last successful login, name=RenewalFailures, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Renewal failures since startup, name=RenewalFailuresTotal, type=java.lang.Long, read-only, descriptor={}]]
14:11:18.647 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:18.647 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=UgiMetrics
14:11:18.647 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source UgiMetrics registered.
14:11:18.647 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source UgiMetrics
14:11:18.648 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=MetricsSystem,sub=Control
14:11:18.650 [main] INFO org.apache.ranger.server.tomcat.EmbeddedServer -- Selected Tomcat protocolHandler: "http-nio-51381"
14:11:18.650 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- KMSMetricSource, KMS metrics
14:11:18.650 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:18.650 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:18.650 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:18.652 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=KEY_CREATE_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=KEY_CREATE_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_DECRYPT_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_DECRYPT_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_GENERATE_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_GENERATE_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_REENCRYPT_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=EEK_REENCRYPT_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=REENCRYPT_EEK_BATCH_KEYS_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=DELETE_KEY_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=DELETE_KEY_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=ROLL_NEW_VERSION_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=ROLL_NEW_VERSION_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=INVALIDATE_CACHE_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=INVALIDATE_CACHE_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_METADATA_KEYNAMES_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_COUNT  , value=0 , type=COUNTER
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEYS_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.653 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_METADATA_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_METADATA_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_CURRENT_KEY_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_CURRENT_KEY_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSION_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSION_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSIONS_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=GET_KEY_VERSIONS_ELAPSED_TIME  , value=0 , type=GAUGE
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=UNAUTHENTICATED_CALLS_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=UNAUTHORIZED_CALLS_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.ranger.kms.metrics.source.KMSMetricSource -- KMSMetricSource: key=TOTAL_CALL_COUNT  , value=0 , type=COUNTER
14:11:18.654 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:18.654 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=37
14:11:18.654 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:18.654 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=KEY_CREATE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=KEY_CREATE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_DECRYPT_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_DECRYPT_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_GENERATE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_GENERATE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_REENCRYPT_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=EEK_REENCRYPT_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=REENCRYPT_EEK_BATCH_KEYS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=DELETE_KEY_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=DELETE_KEY_ELAPSED_TIME, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=, name=ROLL_NEW_VERSION_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=ROLL_NEW_VERSION_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=INVALIDATE_CACHE_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=INVALIDATE_CACHE_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_METADATA_KEYNAMES_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEYS_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_METADATA_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_METADATA_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_CURRENT_KEY_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_CURRENT_KEY_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSION_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSION_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSIONS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=GET_KEY_VERSIONS_ELAPSED_TIME, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHENTICATED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHORIZED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=TOTAL_CALL_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHENTICATED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=, name=UNAUTHORIZED_CALLS_COUNT, type=java.lang.Long, read-only, descriptor={}]]
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=KMSMetricSource
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source KMSMetricSource registered.
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source KMSMetricSource
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- RangerJVM, Ranger common metric source (RangerMetricsJvmSource)
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:18.655 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:18.659 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:18.659 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=13
14:11:18.659 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:18.659 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger current memory utilization, name=MemoryCurrent, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger max memory utilization, name=MemoryMax, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app total GCs, name=GcCountTotal, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app total GC time, name=GcTimeTotal, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger app MAX GC time, name=GcTimeMax, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger busy threads, name=ThreadsBusy, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger blocked threads, name=ThreadsBlocked, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger waiting threads, name=ThreadsWaiting, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger remaining threads, name=ThreadsRemaining, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger Processors available, name=ProcessorsAvailable, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger System Load Average, name=SystemLoadAvg, type=java.lang.Float, read-only, descriptor={}]]
14:11:18.660 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:18.660 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=RangerJVM
14:11:18.660 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source RangerJVM registered.
14:11:18.660 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source RangerJVM
14:11:18.661 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- RangerContainer, Ranger web container metric source (RangerMetricsContainerSource)
14:11:18.661 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:18.661 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:18.661 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:18.662 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:18.662 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=11
14:11:18.662 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:18.662 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger max configured container connections, name=MaxConnectionsCount, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger active container connections, name=ActiveConnectionsCount, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger accept connections count, name=ConnectionAcceptCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger connection timeout, name=ConnectionTimeout, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger connection keepAlive timeout, name=KeepAliveTimeout, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container worker threads count, name=MaxWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container minimum spare worker threads count, name=MinSpareWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container active worker threads count, name=ActiveWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Ranger container total worker threads count, name=TotalWorkerThreadsCount, type=java.lang.Integer, read-only, descriptor={}]]
14:11:18.663 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:18.663 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=RangerContainer
14:11:18.663 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source RangerContainer registered.
14:11:18.663 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source RangerContainer
14:11:18.666 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Prometheus, Ranger common metric sink (RangerMetricsPrometheusSink)
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.context
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: context
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.context
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.period
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: period
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.period
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.queue.capacity
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: queue.capacity
14:11:18.667 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.queue.capacity
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.delay
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.delay
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.delay
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.backoff
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.backoff
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.backoff
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.count
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.count
14:11:18.668 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.count
14:11:18.669 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSinkAdapter -- Sink Prometheus started
14:11:18.670 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered sink Prometheus
14:11:18.670 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Json, Ranger common metric sink (RangerMetricsJsonSink)
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.context
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: context
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.context
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.period
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: period
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.period
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.queue.capacity
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: queue.capacity
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.queue.capacity
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.delay
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.delay
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.delay
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.backoff
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.backoff
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.backoff
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: sink.retry.count
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: retry.count
14:11:18.671 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.retry.count
14:11:18.672 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSinkAdapter -- Sink Json started
14:11:18.672 [main] INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered sink Json
14:11:18.672 [main] INFO org.apache.ranger.metrics.RangerMetricsSystemWrapper -- ===>> Ranger Metric system initialized successfully.
14:11:18.672 [main] DEBUG org.apache.ranger.kms.metrics.KMSMetricWrapper -- <<=== KMSMetricWrapper.init()
14:11:18.672 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- ------------------ Ranger KMSWebApp---------------------
14:11:18.672 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- provider string = dbks://http@localhost:9292/kms
14:11:18.672 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- URI = dbks://http@localhost:9292/kms scheme = dbks
14:11:18.672 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- kmsconf size= 344 kms classname=org.apache.hadoop.conf.Configuration
14:11:18.672 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- ----------------Instantiating key provider ---------------
14:11:18.684 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> createProvider(dbks://http@localhost:9292/kms)
14:11:18.685 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> RangerKeyStoreProvider(conf)
14:11:18.685 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getConfiguration()
14:11:18.685 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getConfiguration()
14:11:18.686 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:11:18.714 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.733 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Acquiring creator semaphore for file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.733 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Acquiring creator semaphore for file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks: duration 0:00.001s
14:11:18.737 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Starting: Creating FS file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.737 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Loading filesystems
14:11:18.753 [main] DEBUG org.apache.hadoop.fs.FileSystem -- file:// = class org.apache.hadoop.fs.LocalFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:18.761 [main] DEBUG org.apache.hadoop.fs.FileSystem -- viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:18.764 [main] DEBUG org.apache.hadoop.fs.FileSystem -- har:// = class org.apache.hadoop.fs.HarFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:18.767 [main] DEBUG org.apache.hadoop.fs.FileSystem -- http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:18.768 [main] DEBUG org.apache.hadoop.fs.FileSystem -- https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-kms/ews/webapp/WEB-INF/lib/hadoop-common-3.3.6.jar
14:11:18.769 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking for FS supporting file
14:11:18.769 [main] DEBUG org.apache.hadoop.fs.FileSystem -- looking for configuration option fs.file.impl
14:11:18.769 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Looking in service filesystems for implementation class
14:11:18.769 [main] DEBUG org.apache.hadoop.fs.FileSystem -- FS for file is class org.apache.hadoop.fs.LocalFileSystem
14:11:18.776 [main] DEBUG org.apache.hadoop.fs.FileSystem -- Creating FS file:///root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks: duration 0:00.039s
14:11:18.781 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:11:18.781 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:11:18.781 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:11:18.785 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.785 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:11:18.786 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:11:18.786 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:11:18.788 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.789 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:11:18.789 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:11:18.789 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:11:18.792 [main] DEBUG org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider -- backing jks path initialized to file:/root/ranger-2.1.0-kms/ews/webapp/WEB-INF/classes/conf/.jceks/rangerkms.jceks
14:11:18.792 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Credential keystore password not applied for KMS; clear text password shall be applicable
14:11:18.792 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:11:18.792 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getFromJceks()
14:11:18.792 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getFromJceks()
14:11:20.445 [main] INFO org.apache.hadoop.crypto.key.RangerKMSDB -- Connected to DB : false
14:11:20.446 [main] INFO org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- Ranger KMS Database is enabled for storing master key.
14:11:20.455 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> generateAndGetMasterKey()
14:11:20.455 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.generateMasterKey()
14:11:20.455 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Generating Master Key...
14:11:20.455 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.init()
14:11:20.464 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.init()
14:11:20.573 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Master Key doesn't exist in DB, Generating the Master Key
14:11:20.573 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.encryptMasterKey()
14:11:20.573 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.generateMasterKey()
14:11:20.573 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPBEParameterSpec()
14:11:20.574 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.encryptKey()
14:11:20.574 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPasswordKey()
14:11:20.574 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getPasswordKey()
14:11:20.587 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.encryptKey()
14:11:20.587 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.encryptMasterKey()
14:11:20.594 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.saveEncryptedMK()
14:11:20.624 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.saveEncryptedMK()
14:11:20.624 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- Master Key Created with id = 1
14:11:20.624 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.generateMasterKey()
14:11:20.624 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getMasterKey()
14:11:20.624 [main] INFO org.apache.hadoop.crypto.key.RangerMasterKey -- Getting Master Key
14:11:20.624 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getEncryptedMK()
14:11:20.642 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getEncryptedMK()
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getMasterKey()
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.decryptMasterKey()
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- Decrypting Master Key...
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPBEParameterSpec()
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- ==> RangerMasterKey.getPasswordKey()
14:11:20.645 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.getPasswordKey()
14:11:20.647 [main] DEBUG org.apache.hadoop.crypto.key.RangerMasterKey -- <== RangerMasterKey.decryptMasterKey()
14:11:20.647 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== generateAndGetMasterKey()
14:11:20.647 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:11:20.648 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:11:20.648 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:11:20.648 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:11:20.652 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=0
14:11:20.652 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- RangerKeyStore might be null or key is not present in the database.
14:11:20.652 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:11:20.652 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:11:20.652 [main] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== createProvider(dbks://http@localhost:9292/kms)
14:11:20.652 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- keyProvider = org.apache.hadoop.crypto.key.RangerKeyStoreProvider@52d93447
14:11:20.657 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Initialized KeyProvider CachingKeyProvider: org.apache.hadoop.crypto.key.RangerKeyStoreProvider@52d93447
14:11:20.667 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Initialized KeyProviderCryptoExtension org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider:EagerKeyGeneratorKeyProviderCryptoExtension: KeyProviderCryptoExtension: CachingKeyProvider: org.apache.hadoop.crypto.key.RangerKeyStoreProvider@52d93447
14:11:20.667 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Default key bitlength is 128
14:11:20.667 [main] INFO org.apache.hadoop.crypto.key.kms.server.KMSWebApp -- Ranger KMS Started
14:11:20.722 [main] DEBUG org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics -- Initialized MetricsRegistry{info=MetricsInfoImpl{name=DelegationTokenSecretManagerMetrics, description=DelegationTokenSecretManagerMetrics}, tags=[], metrics=[]}
14:11:20.723 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.removeToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of removal of delegation tokens and latency (milliseconds)"})
14:11:20.723 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.storeToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of storage of delegation tokens and latency (milliseconds)"})
14:11:20.723 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.tokenFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Counter of delegation tokens operation failures"})
14:11:20.724 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory -- field private org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager$DelegationTokenSecretManagerMetrics.updateToken with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of update of delegation tokens and latency (milliseconds)"})
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- DelegationTokenSecretManagerMetrics, Delegation token secret manager metrics
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'MetricsConfig' for key: source.start_mbeans
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig -- poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating attr cache...
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done. # tags & metrics=9
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Updating info cache...
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- [javax.management.MBeanAttributeInfo[description=Metrics context, name=tag.Context, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Local hostname, name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of removal of delegation tokens and latency (milliseconds), name=RemoveTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of removal of delegation tokens and latency (milliseconds), name=RemoveTokenAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of storage of delegation tokens and latency (milliseconds), name=StoreTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of storage of delegation tokens and latency (milliseconds), name=StoreTokenAvgTime, type=java.lang.Double, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Counter of delegation tokens operation failures, name=TokenFailure, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops for rate of update of delegation tokens and latency (milliseconds), name=UpdateTokenNumOps, type=java.lang.Long, read-only, descriptor={}], javax.management.MBeanAttributeInfo[description=Average time for rate of update of delegation tokens and latency (milliseconds), name=UpdateTokenAvgTime, type=java.lang.Double, read-only, descriptor={}]]
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- Done
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.util.MBeans -- Registered Hadoop:service=kms,name=DelegationTokenSecretManagerMetrics
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter -- MBean for source DelegationTokenSecretManagerMetrics registered.
14:11:20.725 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl -- Registered source DelegationTokenSecretManagerMetrics
14:11:20.729 [main] INFO org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler -- Using keytab /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/HTTP_127.25.254.212@KRBTEST.COM.keytab, for principal HTTP/127.25.254.212@KRBTEST.COM
14:11:20.735 [main] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Updating the current master key for generating delegation tokens
14:11:20.738 [Thread[Thread-14,5,main]] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Starting expired delegation token remover thread, tokenRemoverScanInterval=60 min(s)
14:11:20.738 [Thread[Thread-14,5,main]] INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager -- Updating the current master key for generating delegation tokens
14:11:20.754 [main] INFO com.sun.jersey.api.core.PackagesResourceConfig -- Scanning for root resource and provider classes in the packages:
  org.apache.hadoop.crypto.key.kms.server
14:11:20.770 [main] INFO com.sun.jersey.api.core.ScanningResourceConfig -- Root resource classes found:
  class org.apache.hadoop.crypto.key.kms.server.MetricREST
  class org.apache.hadoop.crypto.key.kms.server.RangerKMSRestApi
  class org.apache.hadoop.crypto.key.kms.server.KMS
14:11:20.770 [main] INFO com.sun.jersey.api.core.ScanningResourceConfig -- Provider classes found:
  class org.apache.hadoop.crypto.key.kms.server.KMSJSONReader
  class org.apache.hadoop.crypto.key.kms.server.KMSExceptionsProvider
  class org.apache.hadoop.crypto.key.kms.server.KMSJSONWriter
14:11:20.831 [main] INFO com.sun.jersey.server.impl.application.WebApplicationImpl -- Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:46 PM'
14:11:21.105 [main] ERROR com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl -- Error while searching for service [javax.xml.bind.JAXBContextFactory]
javax.xml.bind.JAXBException: Error while searching for service [javax.xml.bind.JAXBContextFactory]
	at javax.xml.bind.ContextFinder$1.createException(ContextFinder.java:118)
	at javax.xml.bind.ContextFinder$1.createException(ContextFinder.java:115)
	at javax.xml.bind.ServiceLoaderUtil.firstByServiceLoader(ServiceLoaderUtil.java:76)
	at javax.xml.bind.ContextFinder.find(ContextFinder.java:343)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:508)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:465)
	at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:366)
	at com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl.<init>(WadlApplicationContextImpl.java:107)
	at com.sun.jersey.server.impl.wadl.WadlFactory.init(WadlFactory.java:100)
	at com.sun.jersey.server.impl.application.RootResourceUriRules.initWadl(RootResourceUriRules.java:169)
	at com.sun.jersey.server.impl.application.RootResourceUriRules.<init>(RootResourceUriRules.java:106)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._initiate(WebApplicationImpl.java:1359)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.access$700(WebApplicationImpl.java:180)
	at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:799)
	at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:795)
	at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:795)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:790)
	at com.sun.jersey.spi.container.servlet.ServletContainer.initiate(ServletContainer.java:509)
	at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.initiate(ServletContainer.java:339)
	at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:605)
	at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:207)
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394)
	at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577)
	at javax.servlet.GenericServlet.init(GenericServlet.java:143)
	at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:984)
	at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:941)
	at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:838)
	at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4193)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4494)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:721)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1203)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1193)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
	at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:749)
	at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:211)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardService.startInternal(StandardService.java:415)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:874)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:164)
	at org.apache.catalina.startup.Tomcat.start(Tomcat.java:438)
	at org.apache.ranger.server.tomcat.EmbeddedServer.startServer(EmbeddedServer.java:351)
	at org.apache.ranger.server.tomcat.EmbeddedServer.start(EmbeddedServer.java:317)
	at org.apache.ranger.server.tomcat.EmbeddedServer.main(EmbeddedServer.java:95)
Caused by: java.util.ServiceConfigurationError: javax.xml.bind.JAXBContextFactory: com.sun.xml.bind.v2.JAXBContextFactory not a subtype
	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:593)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1244)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1273)
	at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1309)
	at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1393)
	at javax.xml.bind.ServiceLoaderUtil.firstByServiceLoader(ServiceLoaderUtil.java:69)
	... 52 common frames omitted
14:11:21.187 [main] INFO org.apache.coyote.http11.Http11NioProtocol -- Starting ProtocolHandler ["http-nio-51381"]
14:11:21.283 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/keys
14:11:21.352 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keys] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:21.352 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@2ebcddf9)
14:11:21.380 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keys] user [keyadmin] authenticated
14:11:21.411 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> createKey()
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(CREATE, keyadmin@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest)
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): calling childClassLoader.findClass()
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest)
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): calling childClassLoader().findClass() 
14:11:21.416 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl)
14:11:21.417 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): calling childClassLoader.findClass()
14:11:21.417 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl)
14:11:21.417 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): calling childClassLoader().findClass() 
14:11:21.417 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): class org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl
14:11:21.417 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl): class org.apache.ranger.plugin.policyengine.RangerAccessRequestImpl
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): class org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest): class org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope)
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): calling childClassLoader.findClass()
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope)
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): calling childClassLoader().findClass() 
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope
14:11:21.418 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceMatchingScope
14:11:21.419 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil)
14:11:21.419 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): calling childClassLoader.findClass()
14:11:21.419 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil)
14:11:21.419 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): calling childClassLoader().findClass() 
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): class org.apache.ranger.plugin.util.RangerAccessRequestUtil
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerAccessRequestUtil): class org.apache.ranger.plugin.util.RangerAccessRequestUtil
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource)
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): calling childClassLoader.findClass()
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource)
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): calling childClassLoader().findClass() 
14:11:21.420 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl)
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): calling childClassLoader.findClass()
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl)
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): calling childClassLoader().findClass() 
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource)
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): calling childClassLoader.findClass()
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource)
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): calling childClassLoader().findClass() 
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): interface org.apache.ranger.plugin.policyengine.RangerMutableResource
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerMutableResource): interface org.apache.ranger.plugin.policyengine.RangerMutableResource
14:11:21.421 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): class org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
14:11:21.422 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl): class org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
14:11:21.422 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): class org.apache.ranger.authorization.kms.authorizer.RangerKMSResource
14:11:21.422 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.authorization.kms.authorizer.RangerKMSResource): class org.apache.ranger.authorization.kms.authorizer.RangerKMSResource
14:11:21.443 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.Groups -- GroupCacheLoader - load.
14:11:21.451 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user keyadmin
java.io.IOException: No groups found for user keyadmin
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.access$400(Groups.java:75)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:334)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:270)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:228)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:121)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets)
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling childClassLoader.findClass()
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets)
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling childClassLoader().findClass() 
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling componentClassLoader.findClass()
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): calling componentClassLoader.loadClass()
14:11:21.452 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.hadoop.thirdparty.com.google.common.collect.Sets): class org.apache.hadoop.thirdparty.com.google.common.collect.Sets
14:11:21.453 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:21.453 [http-nio-51381-exec-1] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(40e6fd8f_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.453 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.453 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:21.453 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:21.453 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult)
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): calling childClassLoader.findClass()
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult)
14:11:21.454 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): calling childClassLoader().findClass() 
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): class org.apache.ranger.plugin.policyengine.RangerAccessResult
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessResult): class org.apache.ranger.plugin.policyengine.RangerAccessResult
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1)
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): calling childClassLoader.findClass()
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1)
14:11:21.455 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): calling childClassLoader().findClass() 
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1): class org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl$1
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError)
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError): calling childClassLoader.findClass()
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(java.lang.NoSuchFieldError): class java.lang.NoSuchFieldError
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:21.456 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever)
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): calling childClassLoader.findClass()
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever)
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): calling childClassLoader().findClass() 
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): class org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever): class org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever
14:11:21.457 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector)
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): calling childClassLoader.findClass()
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector)
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): calling childClassLoader().findClass() 
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector): class org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:21.458 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope)
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): calling childClassLoader.findClass()
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope)
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): calling childClassLoader().findClass() 
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchingScope
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):502421:618139
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@136648ef
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):2116222:2629691
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026)
14:11:21.459 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026) : true
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest)
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): calling childClassLoader.findClass()
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest)
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): calling childClassLoader().findClass() 
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): class org.apache.ranger.plugin.policyengine.RangerTagAccessRequest
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerTagAccessRequest): class org.apache.ranger.plugin.policyengine.RangerTagAccessRequest
14:11:21.460 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType)
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): calling childClassLoader.findClass()
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType)
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): calling childClassLoader().findClass() 
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): class org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType): class org.apache.ranger.plugin.policyresourcematcher.RangerPolicyResourceMatcher$MatchType
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:21.461 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType)
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): calling childClassLoader.findClass()
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType)
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): calling childClassLoader().findClass() 
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType
14:11:21.462 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType): class org.apache.ranger.plugin.policyengine.RangerAccessRequest$ResourceElementMatchType
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1)
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): calling childClassLoader.findClass()
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1)
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): calling childClassLoader().findClass() 
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1): class org.apache.ranger.plugin.resourcematcher.ResourceMatcher$1
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:21.463 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-1:RangerDefaultPolicyResourceMatcher.getMatchType():1640622:2162801
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[create]
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper)
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): calling childClassLoader.findClass()
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper)
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): calling childClassLoader().findClass() 
14:11:21.464 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): class org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper): class org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46)
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46): null
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46)
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46)
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46)
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null)
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null): true
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46): true
14:11:21.465 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46)
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46): true
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):467735:665888
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46): true
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@76a7e46): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@39da6daa
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{create=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyEvaluator.evaluate(requestHashCode=40e6fd8f,policyId=3, policyName=all):4999563:6494526
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{create=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.466 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyEngine.evaluatePolicies(requestHashCode=40e6fd8f_0):10738273:13679995
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-1:RangerResourceTrie.traverse(resource=kuduclusterkey):31087:31557
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@676f6a5c
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-1:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):267654:391051
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:21.467 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1)
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): calling childClassLoader.findClass()
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1)
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): calling childClassLoader().findClass() 
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.findClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.loadClass(org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1): class org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator$1
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.468 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAction(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={create} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={create} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.469 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): null
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(null)
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(null)
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(CREATE, keyadmin@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), CREATE)
14:11:21.470 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- Creating key: name=kuduclusterkey, cipher=AES/CTR/NoPadding, keyLength=128, description=kuduclusterkey
14:11:21.471 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: keyadmin@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$229/0x00007f3d08613a80@17b20255]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:149)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.473 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:21.493 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:11:21.493 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, keyadmin@KRBTEST.COM (auth:KERBEROS), MANAGEMENT)
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> createKey(kuduclusterkey)
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:11:21.494 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=0
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- RangerKeyStore might be null or key is not present in the database.
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=false
14:11:21.497 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> innerSetKeyVersion(name=kuduclusterkey, versionName=kuduclusterkey@0)
14:11:21.499 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> addKeyEntry(kuduclusterkey)
14:11:21.500 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> sealKey()
14:11:21.511 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== sealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStore$RangerSealedObject@7f392a9b
14:11:21.512 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== addKeyEntry(kuduclusterkey)
14:11:21.512 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> addKeyEntry(kuduclusterkey@0)
14:11:21.512 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> sealKey()
14:11:21.515 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== sealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStore$RangerSealedObject@179a37f0
14:11:21.515 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== addKeyEntry(kuduclusterkey@0)
14:11:21.516 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== innerSetKeyVersion(name=kuduclusterkey, versionName=kuduclusterkey@0): ret=key(kuduclusterkey@0)= 05 4d 33 cd 46 54 c5 f0 6c 6b 30 1c 23 ac 2d 76
14:11:21.516 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== createKey(kuduclusterkey)
14:11:21.516 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> flush()
14:11:21.516 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineStore()
14:11:21.527 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationStore(kuduclusterkey)
14:11:21.550 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationStore(kuduclusterkey)
14:11:21.550 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationStore(kuduclusterkey@0)
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationStore(kuduclusterkey@0)
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineStore()
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> reloadKeys()
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> loadKeys()
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineLoad()
14:11:21.557 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> dbOperationLoad()
14:11:21.559 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== dbOperationLoad(): count=2
14:11:21.565 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded key kuduclusterkey
14:11:21.565 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded key kuduclusterkey@0
14:11:21.565 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): loaded 2 keys
14:11:21.565 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- engineLoad(): keyEntries switched with 2 keys
14:11:21.566 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== loadKeys()
14:11:21.566 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== reloadKeys()
14:11:21.566 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== flush()
14:11:21.567 [http-nio-51381-exec-1] INFO kms-audit -- OK[op=CREATE_KEY, key=kuduclusterkey, user=keyadmin@KRBTEST.COM] UserProvidedMaterial:false Description:kuduclusterkey
14:11:21.568 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:11:21.568 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.568 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.568 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:11:21.568 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user keyadmin
java.io.IOException: No groups found for user keyadmin
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:221)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:143)
	at org.apache.hadoop.crypto.key.kms.server.KMS.createKey(KMS.java:165)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:21.569 [http-nio-51381-exec-1] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(5dc1901e_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=; } })
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=; } }): ret=null
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=; } }): ret=null
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.569 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026)
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026) : true
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=; } })
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(): true
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.570 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(, {token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-1:RangerDefaultPolicyResourceMatcher.getMatchType():405891:582208
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=; } }{token:USER=keyadmin, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[get]
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9): null
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, keyadmin, [], null, null): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyItemEvaluator.isMatch(resource=null):133692:157304
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9): true
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@eb570d9): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@39da6daa
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{get=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyEvaluator.evaluate(requestHashCode=5dc1901e,policyId=3, policyName=all):1151086:1541940
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{get=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.571 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-1:RangerPolicyEngine.evaluatePolicies(requestHashCode=5dc1901e_0):2146858:2989019
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:11:21.572 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAction(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=true
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=; } }} accessType={get} user={keyadmin} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={get} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={keyadmin} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): null
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(null)
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(null)
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.perf.kmsauth.request -- [PERF]:http-nio-51381-exec-1:RangerKmsAuthorizer.hasAccess(type=GET):3789876:5120591
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS)): true
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.573 [http-nio-51381-exec-1] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccess(GET, keyadmin@KRBTEST.COM (auth:KERBEROS))
14:11:21.578 [http-nio-51381-exec-1] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== createKey()
May 04 14:11:21 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903881, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
Password for test-admin@KRBTEST.COM: 
WARNING: no policy specified for kudu/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.254@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.254@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.254 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:11:21.670540 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:40767
--webserver_interface=127.25.254.254
--webserver_port=0
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:43877
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.25.254.254:40767
--ranger_config_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-client
--trusted_user_acl=test-admin
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:51381/kms
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:11:21.826421  7691 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:11:21.826788  7691 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:11:21.826865  7691 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:11:21.832110  7691 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260504 14:11:21.832221  7691 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:11:21.832255  7691 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:11:21.832279  7691 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260504 14:11:21.832357  7691 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260504 14:11:21.839710  7691 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:43877
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal
--ranger_kms_url=127.25.254.212:51381/kms
--trusted_user_acl=<redacted>
--ipki_ca_key_size=768
--master_addresses=127.25.254.254:40767
--ranger_config_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ranger-client
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.254
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.25.254.254:40767
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.25.254.254
--webserver_port=0
--webserver_require_spnego=true
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.7691
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:11:21.841243  7691 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:11:21.842410  7691 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:11:21.849733  7697 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:11:21.850001  7691 server_base.cc:1061] running on GCE node
W20260504 14:11:21.850205  7696 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:11:21.850301  7699 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:11:21.851012  7691 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:11:21.852236  7691 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:11:21.853432  7691 hybrid_clock.cc:648] HybridClock initialized: now 1777903881853398 us; error 53 us; skew 500 ppm
May 04 14:11:21 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903881, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:11:21.857393  7691 init.cc:377] Logged in from keytab as kudu/127.25.254.254@KRBTEST.COM (short username kudu)
I20260504 14:11:21.858989  7691 webserver.cc:492] Webserver started at http://127.25.254.254:33785/ using document root <none> and password file <none>
I20260504 14:11:21.859748  7691 fs_manager.cc:362] Metadata directory not provided
I20260504 14:11:21.859825  7691 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:11:21.860096  7691 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:11:21 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903881, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.254@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:11:21.864 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:11:21.874 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:21.875 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@5b2d28f1)
14:11:21.886 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:21.890 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.Groups -- GroupCacheLoader - load.
14:11:21.891 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.access$400(Groups.java:75)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:334)
	at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:270)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:228)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.891 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:21.892 [http-nio-51381-exec-2] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(39b99f52_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:21.892 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-2:RangerResourceTrie.traverse(resource=kuduclusterkey):18311:19040
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@b614234
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-2:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):266368:363502
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026)
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:21 UTC 2026) : true
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:21.893 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-2:RangerDefaultPolicyResourceMatcher.getMatchType():470255:721289
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): null
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.894 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): false
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-2:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):175693:264698
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): false
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): true
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): true
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-2:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):184568:316934
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): true
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@304a0495): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:21.895 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-2:RangerPolicyEvaluator.evaluate(requestHashCode=39b99f52,policyId=3, policyName=all):1572702:2448275
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-2:RangerPolicyEngine.evaluatePolicies(requestHashCode=39b99f52_0):3005536:4500455
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-2:RangerResourceTrie.traverse(resource=kuduclusterkey):18801:19058
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@777cf3fa
14:11:21.896 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-2:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):291969:571950
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:21.897 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:21 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.898 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:21 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0;seq_num=0;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:21 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0;seq_num=0;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:21 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0;seq_num=1;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-0} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.899 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.901 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f3d08620670@656ae5cb]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.905 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getMetadata(kuduclusterkey)
14:11:21.905 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=true
14:11:21.905 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey): ret=true
14:11:21.905 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey)
14:11:21.905 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:11:21.912 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=org.apache.hadoop.crypto.key.RangerKeyStoreProvider$KeyMetadata@4abfd75d
14:11:21.912 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey): ret=org.apache.hadoop.crypto.key.RangerKeyStoreProvider$KeyMetadata@4abfd75d
14:11:21.914 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getMetadata(kuduclusterkey): ret=cipher: AES/CTR/NoPadding, length: 128, description: kuduclusterkey, created: Mon May 04 14:11:21 UTC 2026, version: 1, attributes: [key.acl.name=kuduclusterkey] 
14:11:21.914 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:21.914 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.914 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.914 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getMetadata(kuduclusterkey)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getMetadata(kuduclusterkey): ret=cipher: AES/CTR/NoPadding, length: 128, description: kuduclusterkey, created: Mon May 04 14:11:21 UTC 2026, version: 1, attributes: [key.acl.name=kuduclusterkey] 
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getKeyVersion(kuduclusterkey@0)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey@0): ret=true
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey@0)
14:11:21.915 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:11:21.916 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=javax.crypto.spec.SecretKeySpec@d4aa8a79
14:11:21.916 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey@0): ret=javax.crypto.spec.SecretKeySpec@d4aa8a79
14:11:21.916 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getKeyVersion(kuduclusterkey@0)
14:11:21.928 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.util.NativeCodeLoader -- Trying to load the custom-built native-hadoop library...
14:11:21.929 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.util.NativeCodeLoader -- Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path: /tmp/dist-test-taskMMfo7I/build/dist-test-system-libs/:/tmp/dist-test-taskMMfo7I/build/debug/lib:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
14:11:21.929 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.util.NativeCodeLoader -- java.library.path=/tmp/dist-test-taskMMfo7I/build/dist-test-system-libs/:/tmp/dist-test-taskMMfo7I/build/debug/lib:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
14:11:21.929 [http-nio-51381-exec-2] WARN org.apache.hadoop.util.NativeCodeLoader -- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14:11:21.929 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.OpensslCipher -- Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: 'boolean org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()'
	at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method)
	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:85)
	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:69)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:102)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension$DefaultCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:299)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:518)
	at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension$EncryptedQueueRefiller.fillQueueForKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:76)
	at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:249)
	at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:243)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.get(LocalCache.java:3962)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3985)
	at org.apache.hadoop.thirdparty.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4946)
	at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:353)
	at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:293)
	at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension.generateEncryptedKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:125)
	at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:518)
	at org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider.generateEncryptedKey(KeyAuthorizationKeyProvider.java:175)
	at org.apache.hadoop.crypto.key.kms.server.KMS.lambda$generateEncryptedKeys$10(KMS.java:532)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:21.930 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:21.931 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:21.960 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:21.960 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:21.962 [http-nio-51381-exec-2] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.254@KRBTEST.COM, accessCount=1, interval=0ms] 
14:11:21.962 [http-nio-51381-exec-2] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:11:21.968142  7691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data/instance:
uuid: "11c48df93b2a4615b26c8114ada49e66"
format_stamp: "Formatted at 2026-05-04 14:11:21 on dist-test-slave-2x32"
server_key: "c5691fc7ef82fccacad783fe35eade56"
server_key_iv: "c8c860f1fa7092c16ece1a210a1ba792"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:21.969161  7691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal/instance:
uuid: "11c48df93b2a4615b26c8114ada49e66"
format_stamp: "Formatted at 2026-05-04 14:11:21 on dist-test-slave-2x32"
server_key: "c5691fc7ef82fccacad783fe35eade56"
server_key_iv: "c8c860f1fa7092c16ece1a210a1ba792"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:21.980157  7691 fs_manager.cc:696] Time spent creating directory manager: real 0.011s	user 0.004s	sys 0.000s
14:11:21.983 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:11:21.983 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:21.983 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@e3a236f)
14:11:21.994 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:11:21.999 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:22.000 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:22.001 [http-nio-51381-exec-3] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(1ebeb4ef_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:22.001 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:22.000 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.002 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.003 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-3:RangerResourceTrie.traverse(resource=kuduclusterkey):34935:33263
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@70cb20ec
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-3:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):240657:1823255
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:22 UTC 2026)
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:22 UTC 2026) : true
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:22.003 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-3:RangerDefaultPolicyResourceMatcher.getMatchType():391165:391480
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:22.004 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): null
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): false
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-3:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):115666:115958
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): false
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:22.004 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): true
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a)
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): true
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-3:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):127831:128158
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): true
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@665f0c5a): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:22.005 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.005 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.005 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.005 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-3:RangerPolicyEvaluator.evaluate(requestHashCode=1ebeb4ef,policyId=3, policyName=all):1373550:2389249
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-3:RangerPolicyEngine.evaluatePolicies(requestHashCode=1ebeb4ef_0):2560671:5156401
14:11:22.006 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:22.006 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.007 [org.apache.hadoop.crypto.key.kms.ValueQueue_thread] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.006 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-3:RangerResourceTrie.traverse(resource=kuduclusterkey):13272:13653
14:11:22.008 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@21ef39ac
14:11:22.008 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:22.008 [http-nio-51381-exec-3] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-3:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):364560:2137884
14:11:22.008 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:22.009 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:22 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:22 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1;seq_num=2;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:22 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1;seq_num=2;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:22 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1;seq_num=3;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-1} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:22.012 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:22.013 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.016 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f3d086231e8@25a1e4ea]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:22.018 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- ==> getKeyVersion(kuduclusterkey@0)
14:11:22.018 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineContainsAlias(kuduclusterkey@0): ret=true
14:11:22.018 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> engineGetKey(kuduclusterkey@0)
14:11:22.018 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- ==> unsealKey()
14:11:22.019 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== unsealKey(): ret=javax.crypto.spec.SecretKeySpec@d4aa8a79
14:11:22.023 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStore -- <== engineGetKey(kuduclusterkey@0): ret=javax.crypto.spec.SecretKeySpec@d4aa8a79
14:11:22.023 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.RangerKeyStoreProvider -- <== getKeyVersion(kuduclusterkey@0)
14:11:22.023 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:22.023 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:22.023 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:22.024 [http-nio-51381-exec-3] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.254@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:22.025 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:22.025 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:22.025 [http-nio-51381-exec-3] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.254@KRBTEST.COM, accessCount=1, interval=0ms] 
14:11:22.026 [http-nio-51381-exec-3] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:11:22.030599  7708 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:22.032045  7691 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.001s
I20260504 14:11:22.032192  7691 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal
uuid: "11c48df93b2a4615b26c8114ada49e66"
format_stamp: "Formatted at 2026-05-04 14:11:21 on dist-test-slave-2x32"
server_key: "c5691fc7ef82fccacad783fe35eade56"
server_key_iv: "c8c860f1fa7092c16ece1a210a1ba792"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:22.032320  7691 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:11:22.059900  7691 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:11:22.063405  7691 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:11:22.063602  7691 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:11:22.072790  7691 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.254:40767
I20260504 14:11:22.072800  7760 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.254:40767 every 8 connection(s)
I20260504 14:11:22.074242  7691 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/data/info.pb
I20260504 14:11:22.077757  7761 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260504 14:11:22.083426 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 7691
I20260504 14:11:22.083578 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/wal/instance
I20260504 14:11:22.083819 26619 external_mini_cluster.cc:1468] Setting key ef4335edc5a8d6e0e0fda9d41fc0f47c
I20260504 14:11:22.084052  7761 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66: Bootstrap starting.
I20260504 14:11:22.086747  7761 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66: Neither blocks nor log segments found. Creating new log.
I20260504 14:11:22.087651  7761 log.cc:826] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66: Log is configured to *not* fsync() on all Append() calls
I20260504 14:11:22.091544  7761 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66: No bootstrap required, opened a new log
May 04 14:11:22 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903881, etypes {rep=17 tkt=17 ses=17}, test-admin@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
I20260504 14:11:22.096592  7761 raft_consensus.cc:359] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } }
I20260504 14:11:22.096908  7761 raft_consensus.cc:385] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260504 14:11:22.096994  7761 raft_consensus.cc:740] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 11c48df93b2a4615b26c8114ada49e66, State: Initialized, Role: FOLLOWER
I20260504 14:11:22.097505  7761 consensus_queue.cc:260] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } }
I20260504 14:11:22.097690  7761 raft_consensus.cc:399] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260504 14:11:22.097783  7761 raft_consensus.cc:493] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260504 14:11:22.097901  7761 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 0 FOLLOWER]: Advancing to term 1
I20260504 14:11:22.099004  7761 raft_consensus.cc:515] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } }
I20260504 14:11:22.099403  7761 leader_election.cc:304] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 11c48df93b2a4615b26c8114ada49e66; no voters: 
I20260504 14:11:22.099740  7761 leader_election.cc:290] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260504 14:11:22.101001  7761 sys_catalog.cc:565] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [sys.catalog]: configured and running, proceeding with master startup.
I20260504 14:11:22.101122  7764 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:22.085400 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.0.0.1:32874 (local address 127.25.254.254:40767)
0504 14:11:22.086128 (+   728us) server_negotiation.cc:207] Beginning negotiation
0504 14:11:22.086140 (+    12us) server_negotiation.cc:400] Waiting for connection header
0504 14:11:22.086492 (+   352us) server_negotiation.cc:408] Connection header received
0504 14:11:22.087446 (+   954us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:11:22.087470 (+    24us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:11:22.087920 (+   450us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:11:22.088238 (+   318us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:11:22.089719 (+  1481us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:11:22.091017 (+  1298us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:11:22.092256 (+  1239us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:11:22.093758 (+  1502us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:11:22.095012 (+  1254us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:11:22.095038 (+    26us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:11:22.095054 (+    16us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:11:22.095089 (+    35us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:11:22.097526 (+  2437us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:11:22.098208 (+   682us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:11:22.098214 (+     6us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:11:22.098221 (+     7us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:11:22.098299 (+    78us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:11:22.098565 (+   266us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:11:22.098568 (+     3us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:11:22.098570 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:11:22.098983 (+   413us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:11:22.099942 (+   959us) server_negotiation.cc:1036] Waiting for connection context
0504 14:11:22.100117 (+   175us) server_negotiation.cc:300] Negotiation successful
0504 14:11:22.100322 (+   205us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":561,"thread_start_us":123,"threads_started":1}
I20260504 14:11:22.103458  7761 ranger_client.cc:318] Using new properties file: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/master-0/logs/kudu-ranger-subprocess-log4j2.properties
I20260504 14:11:22.104055  7766 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 1 FOLLOWER]: Leader election won for term 1
I20260504 14:11:22.104246  7766 raft_consensus.cc:697] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [term 1 LEADER]: Becoming Leader. State: Replica: 11c48df93b2a4615b26c8114ada49e66, State: Running, Role: LEADER
I20260504 14:11:22.104609  7766 consensus_queue.cc:237] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } }
I20260504 14:11:22.110361  7778 sys_catalog.cc:455] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "11c48df93b2a4615b26c8114ada49e66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } } }
I20260504 14:11:22.110575  7778 sys_catalog.cc:458] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [sys.catalog]: This master's current role is: LEADER
I20260504 14:11:22.111038  7767 sys_catalog.cc:455] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 11c48df93b2a4615b26c8114ada49e66. Latest consensus state: current_term: 1 leader_uuid: "11c48df93b2a4615b26c8114ada49e66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "11c48df93b2a4615b26c8114ada49e66" member_type: VOTER last_known_addr { host: "127.25.254.254" port: 40767 } } }
I20260504 14:11:22.111178  7767 sys_catalog.cc:458] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66 [sys.catalog]: This master's current role is: LEADER
I20260504 14:11:22.114281  7782 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260504 14:11:22.118490  7782 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260504 14:11:22.128320  7782 catalog_manager.cc:1357] Generated new cluster ID: 84839941cb5f411ebafeb83cdfb80062
I20260504 14:11:22.128477  7782 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260504 14:11:22.159255  7782 catalog_manager.cc:1380] Generated new certificate authority record
I20260504 14:11:22.160589  7782 catalog_manager.cc:1514] Loading token signing keys...
I20260504 14:11:22.176033  7782 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 11c48df93b2a4615b26c8114ada49e66: Generated new TSK 0
I20260504 14:11:22.176918  7782 catalog_manager.cc:1524] Initializing in-progress tserver states...
WARNING: no policy specified for kudu/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
WARNING: no policy specified for HTTP/127.25.254.193@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.193@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.193 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:11:22.938318 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:51381/kms
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.193:0
--local_ip_for_outbound_sockets=127.25.254.193
--webserver_interface=127.25.254.193
--webserver_port=0
--tserver_master_addrs=127.25.254.254:40767
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:43877
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
W20260504 14:11:23.061367  7819 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:11:23.061650  7819 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:11:23.061767  7819 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:11:23.065567  7819 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:11:23.065718  7819 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:11:23.065876  7819 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.193
I20260504 14:11:23.071206  7819 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:43877
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal
--ranger_kms_url=127.25.254.212:51381/kms
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.193
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.193:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.25.254.193
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:40767
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.7819
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.193
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:11:23.072652  7819 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:11:23.073736  7819 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:11:23.082492  7826 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:11:23.083666  7825 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:11:23.083973  7819 server_base.cc:1061] running on GCE node
W20260504 14:11:23.086902  7828 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:11:23.087661  7819 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:11:23.088389  7819 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:11:23.089584  7819 hybrid_clock.cc:648] HybridClock initialized: now 1777903883089546 us; error 57 us; skew 500 ppm
May 04 14:11:23 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903883, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:11:23.093465  7819 init.cc:377] Logged in from keytab as kudu/127.25.254.193@KRBTEST.COM (short username kudu)
I20260504 14:11:23.095073  7819 webserver.cc:492] Webserver started at http://127.25.254.193:43603/ using document root <none> and password file <none>
I20260504 14:11:23.096015  7819 fs_manager.cc:362] Metadata directory not provided
I20260504 14:11:23.096174  7819 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:11:23.096485  7819 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:11:23 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903883, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:11:23.100 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:11:23.101 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:23.101 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@2b49d196)
14:11:23.106 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:11:23.107 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:23.108 [http-nio-51381-exec-5] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(551d002a_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.108 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-5:RangerResourceTrie.traverse(resource=kuduclusterkey):18977:20006
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@7957e6e5
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-5:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):185637:185969
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026) : true
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:23.109 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:23.110 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.110 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.110 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:23.110 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-5:RangerDefaultPolicyResourceMatcher.getMatchType():636476:8636299
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): null
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): false
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-5:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):112627:112447
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): false
14:11:23.118 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): true
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): true
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-5:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):128325:128918
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): true
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@49cdffb): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-5:RangerPolicyEvaluator.evaluate(requestHashCode=551d002a,policyId=3, policyName=all):1585047:9584586
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-5:RangerPolicyEngine.evaluatePolicies(requestHashCode=551d002a_0):2615449:10613829
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-5:RangerResourceTrie.traverse(resource=kuduclusterkey):22374:22940
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@19543a8d
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-5:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):165889:166549
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.119 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2;seq_num=4;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2;seq_num=4;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2;seq_num=5;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-2} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.120 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f3d08620670@6253cf68]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.121 [http-nio-51381-exec-5] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.193@KRBTEST.COM, accessCount=1, interval=0ms] 
14:11:23.121 [http-nio-51381-exec-5] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:11:23.126565  7819 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data/instance:
uuid: "8052a1711d6d474fa30bbc1224fef107"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "405803dfd606619a20dc1f05e40f99f5"
server_key_iv: "5129925b43f1f1fa66375e0651c5575c"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.127367  7819 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance:
uuid: "8052a1711d6d474fa30bbc1224fef107"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "405803dfd606619a20dc1f05e40f99f5"
server_key_iv: "5129925b43f1f1fa66375e0651c5575c"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.144512  7819 fs_manager.cc:696] Time spent creating directory manager: real 0.017s	user 0.000s	sys 0.003s
14:11:23.148 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:11:23.148 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:23.148 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@5f910aed)
14:11:23.161 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.165 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:23.166 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.166 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:23.166 [http-nio-51381-exec-4] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(47fa43d6_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-4:RangerResourceTrie.traverse(resource=kuduclusterkey):18015:18750
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@2131f1d1
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-4:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):182993:183774
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026) : true
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:23.167 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-4:RangerDefaultPolicyResourceMatcher.getMatchType():497228:772354
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): null
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.168 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): false
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-4:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):175612:176300
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): false
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-4:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):122271:123297
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): true
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@3f4a09e3): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-4:RangerPolicyEvaluator.evaluate(requestHashCode=47fa43d6,policyId=3, policyName=all):1403078:1677376
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.169 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-4:RangerPolicyEngine.evaluatePolicies(requestHashCode=47fa43d6_0):2712987:5298498
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-4:RangerResourceTrie.traverse(resource=kuduclusterkey):15622:24480
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@70441a1b
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-4:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):225799:226765
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.172 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3;seq_num=6;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3;seq_num=6;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3;seq_num=7;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-3} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.173 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f3d086231e8@4ba4d938]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.193@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.174 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:23.175 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:23.182 [http-nio-51381-exec-4] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.193@KRBTEST.COM, accessCount=1, interval=6ms] 
14:11:23.182 [http-nio-51381-exec-4] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:11:23.186124  7840 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.187655  7819 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.002s	sys 0.000s
I20260504 14:11:23.187876  7819 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal
uuid: "8052a1711d6d474fa30bbc1224fef107"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "405803dfd606619a20dc1f05e40f99f5"
server_key_iv: "5129925b43f1f1fa66375e0651c5575c"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.188161  7819 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:11:23.220831  7819 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:11:23.224539  7819 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:11:23.224882  7819 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:11:23.225636  7819 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:11:23.226882  7819 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:11:23.227005  7819 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.227121  7819 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:11:23.227188  7819 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.256747  7819 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.193:33207
I20260504 14:11:23.258116  7819 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/data/info.pb
I20260504 14:11:23.258469  7953 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.193:33207 every 8 connection(s)
I20260504 14:11:23.265542 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 7819
I20260504 14:11:23.265722 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-0/wal/instance
I20260504 14:11:23.266014 26619 external_mini_cluster.cc:1468] Setting key 6a7229f5fc2c4bb00af6352fce25b3df
WARNING: no policy specified for kudu/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "kudu/127.25.254.194@KRBTEST.COM" created.
May 04 14:11:23 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903883, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.193@KRBTEST.COM for kudu/127.25.254.254@KRBTEST.COM
Exception in thread "main" org.apache.kudu.subprocess.KuduSubprocessException: Failed to login with Kudu principal/keytab
	at org.apache.kudu.subprocess.ranger.authorization.RangerKuduAuthorizer.init(RangerKuduAuthorizer.java:87)
	at org.apache.kudu.subprocess.ranger.RangerProtocolHandler.<init>(RangerProtocolHandler.java:45)
	at org.apache.kudu.subprocess.ranger.RangerSubprocessMain.main(RangerSubprocessMain.java:39)
Caused by: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: kudu/127.25.254.254 from keytab /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab javax.security.auth.login.LoginException: Unable to obtain password from user

	at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:2064)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytabAndReturnUGI(UserGroupInformation.java:1398)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:1136)
	at org.apache.kudu.subprocess.ranger.authorization.RangerKuduAuthorizer.init(RangerKuduAuthorizer.java:85)
	... 2 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user

	at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:878)
	at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:745)
	at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:597)
	at java.base/javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
	at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:679)
	at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:677)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:677)
	at java.base/javax.security.auth.login.LoginContext.login(LoginContext.java:587)
	at org.apache.hadoop.security.UserGroupInformation$HadoopLoginContext.login(UserGroupInformation.java:2148)
	at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:2053)
	... 5 more
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal kudu/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:11:23.314594  7956 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:23.269025 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:53311)
0504 14:11:23.278611 (+  9586us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:23.278677 (+    66us) client_negotiation.cc:175] Beginning negotiation
0504 14:11:23.279848 (+  1171us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:11:23.288481 (+  8633us) client_negotiation.cc:272] Received NEGOTIATE NegotiatePB response
0504 14:11:23.288499 (+    18us) client_negotiation.cc:395] Received NEGOTIATE response from server
0504 14:11:23.288927 (+   428us) client_negotiation.cc:190] Negotiated authn=SASL
0504 14:11:23.289704 (+   777us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:11:23.289725 (+    21us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:11:23.291111 (+  1386us) client_negotiation.cc:272] Received TLS_HANDSHAKE NegotiatePB response
0504 14:11:23.291116 (+     5us) client_negotiation.cc:528] Received TLS_HANDSHAKE response from server
0504 14:11:23.291763 (+   647us) client_negotiation.cc:515] Sending TLS_HANDSHAKE message to server
0504 14:11:23.291774 (+    11us) client_negotiation.cc:253] Sending TLS_HANDSHAKE NegotiatePB request
0504 14:11:23.292054 (+   280us) client_negotiation.cc:549] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:11:23.292827 (+   773us) client_negotiation.cc:624] Initiating SASL GSSAPI handshake
0504 14:11:23.292857 (+    30us) client_negotiation.cc:657] calling sasl_client_start()
0504 14:11:23.294898 (+  2041us) client_negotiation.cc:253] Sending SASL_INITIATE NegotiatePB request
0504 14:11:23.307785 (+ 12887us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:11:23.307794 (+     9us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:11:23.307810 (+    16us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:11:23.308757 (+   947us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:11:23.310659 (+  1902us) client_negotiation.cc:272] Received SASL_CHALLENGE NegotiatePB response
0504 14:11:23.310662 (+     3us) client_negotiation.cc:695] Received SASL_CHALLENGE response from server
0504 14:11:23.310664 (+     2us) client_negotiation.cc:761] calling sasl_client_step()
0504 14:11:23.310800 (+   136us) client_negotiation.cc:253] Sending SASL_RESPONSE NegotiatePB request
0504 14:11:23.311378 (+   578us) client_negotiation.cc:272] Received SASL_SUCCESS NegotiatePB response
0504 14:11:23.311388 (+    10us) client_negotiation.cc:716] Received SASL_SUCCESS response from server
0504 14:11:23.311755 (+   367us) client_negotiation.cc:770] Sending connection context
0504 14:11:23.313219 (+  1464us) client_negotiation.cc:241] Negotiation successful
0504 14:11:23.313546 (+   327us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"client-negotiator.queue_time_us":9305,"thread_start_us":172,"threads_started":1}
I20260504 14:11:23.315548  7958 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:23.270235 (+     0us) reactor.cc:678] Submitting negotiation task for server connection from 127.25.254.193:53311 (local address 127.25.254.254:40767)
0504 14:11:23.278241 (+  8006us) server_negotiation.cc:207] Beginning negotiation
0504 14:11:23.278248 (+     7us) server_negotiation.cc:400] Waiting for connection header
0504 14:11:23.280236 (+  1988us) server_negotiation.cc:408] Connection header received
0504 14:11:23.280301 (+    65us) server_negotiation.cc:366] Received NEGOTIATE NegotiatePB request
0504 14:11:23.280304 (+     3us) server_negotiation.cc:462] Received NEGOTIATE request from client
0504 14:11:23.280372 (+    68us) server_negotiation.cc:378] Sending NEGOTIATE NegotiatePB response
0504 14:11:23.280468 (+    96us) server_negotiation.cc:227] Negotiated authn=SASL
0504 14:11:23.289988 (+  9520us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:11:23.290914 (+   926us) server_negotiation.cc:378] Sending TLS_HANDSHAKE NegotiatePB response
0504 14:11:23.292414 (+  1500us) server_negotiation.cc:366] Received TLS_HANDSHAKE NegotiatePB request
0504 14:11:23.292633 (+   219us) server_negotiation.cc:658] Negotiated auth-only TLSv1.3 with cipher TLS_AES_128_GCM_SHA256 TLSv1.3 Kx=any Au=any Enc=AESGCM(128) Mac=AEAD
0504 14:11:23.295232 (+  2599us) server_negotiation.cc:366] Received SASL_INITIATE NegotiatePB request
0504 14:11:23.295262 (+    30us) server_negotiation.cc:882] Received SASL_INITIATE request from client
0504 14:11:23.295266 (+     4us) server_negotiation.cc:893] Client requested to use mechanism: GSSAPI
0504 14:11:23.295302 (+    36us) server_negotiation.cc:923] calling sasl_server_start()
0504 14:11:23.307475 (+ 12173us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:11:23.308937 (+  1462us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:11:23.308942 (+     5us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:11:23.308944 (+     2us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:11:23.309016 (+    72us) server_negotiation.cc:378] Sending SASL_CHALLENGE NegotiatePB response
0504 14:11:23.310962 (+  1946us) server_negotiation.cc:366] Received SASL_RESPONSE NegotiatePB request
0504 14:11:23.310966 (+     4us) server_negotiation.cc:957] Received SASL_RESPONSE request from client
0504 14:11:23.310967 (+     1us) server_negotiation.cc:968] calling sasl_server_step()
0504 14:11:23.311197 (+   230us) server_negotiation.cc:378] Sending SASL_SUCCESS NegotiatePB response
0504 14:11:23.311308 (+   111us) server_negotiation.cc:1036] Waiting for connection context
0504 14:11:23.315118 (+  3810us) server_negotiation.cc:300] Negotiation successful
0504 14:11:23.315282 (+   164us) negotiation.cc:326] Negotiation complete: OK
Metrics: {"server-negotiator.queue_time_us":7846,"thread_start_us":70,"threads_started":1}
I20260504 14:11:23.316833  7954 heartbeater.cc:344] Connected to a master server at 127.25.254.254:40767
I20260504 14:11:23.317119  7954 heartbeater.cc:461] Registering TS with master...
I20260504 14:11:23.317688  7954 heartbeater.cc:507] Master 127.25.254.254:40767 requested a full tablet report, sending...
I20260504 14:11:23.319523  7725 ts_manager.cc:194] Registered new tserver with Master: 8052a1711d6d474fa30bbc1224fef107 (127.25.254.193:33207)
I20260504 14:11:23.321285  7725 master_service.cc:502] Signed X509 certificate for tserver {username='kudu', principal='kudu/127.25.254.193@KRBTEST.COM'} at 127.25.254.193:53311
WARNING: no policy specified for HTTP/127.25.254.194@KRBTEST.COM; defaulting to no policy
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Principal "HTTP/127.25.254.194@KRBTEST.COM" created.
Authenticating as principal test-admin/admin@KRBTEST.COM with password.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
Entry for principal HTTP/127.25.254.194 with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab.
I20260504 14:11:23.340335  7811 server.cc:273] Received an EOF from the subprocess
F20260504 14:11:23.341254  7783 server.cc:406] The subprocess has exited with status 1
I20260504 14:11:23.341255 26619 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
/tmp/dist-test-taskMMfo7I/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--encrypt_data_at_rest=true
--encryption_key_provider=ranger-kms
--encryption_cluster_key_name=kuduclusterkey
--ranger_kms_url=127.25.254.212:51381/kms
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.25.254.194:0
--local_ip_for_outbound_sockets=127.25.254.194
--webserver_interface=127.25.254.194
--webserver_port=0
--tserver_master_addrs=127.25.254.254:40767
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--rpc_authentication=required
--superuser_acl=test-admin
--user_acl=test-user
--webserver_require_spnego=true
--builtin_ntp_servers=127.25.254.212:43877
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_trace_negotiation with env {KRB5CCNAME=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5cc,KRB5_CONFIG=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/krb5.conf,KRB5_KDC_PROFILE=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kdc.conf,KUDU_ENABLE_KRB5_REALM_FIX=yes}
*** Check failure stack trace: ***
    @     0x7f12a761ddcd  google::LogMessage::Fail() at ??:0
    @     0x7f12a7621b93  google::LogMessage::SendToLog() at ??:0
    @     0x7f12a761d7cc  google::LogMessage::Flush() at ??:0
    @     0x7f12a761ef59  google::LogMessageFatal::~LogMessageFatal() at ??:0
    @     0x7f12a9436157  kudu::subprocess::SubprocessServer::ExitCheckerThread() at ??:0
    @     0x7f12a9432f59  _ZZN4kudu10subprocess16SubprocessServer4InitEvENKUlvE0_clEv at ??:0
    @     0x7f12a9436e85  _ZNSt17_Function_handlerIFvvEZN4kudu10subprocess16SubprocessServer4InitEvEUlvE0_E9_M_invokeERKSt9_Any_data at ??:0
    @     0x55857a22f244  std::function<>::operator()() at ??:0
    @     0x7f12a8b227ea  kudu::Thread::SuperviseThread() at ??:0
    @     0x7f12a94666db  start_thread at ??:0
    @     0x7f12a67d461f  clone at ??:0
W20260504 14:11:23.465065  7963 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260504 14:11:23.465317  7963 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260504 14:11:23.465378  7963 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260504 14:11:23.469274  7963 flags.cc:432] Enabled experimental flag: --rpc_trace_negotiation=true
W20260504 14:11:23.469355  7963 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260504 14:11:23.469485  7963 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.25.254.194
I20260504 14:11:23.474475  7963 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.25.254.212:43877
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--encryption_cluster_key_name=kuduclusterkey
--encryption_key_provider=ranger-kms
--fs_data_dirs=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal
--ranger_kms_url=127.25.254.212:51381/kms
--rpc_trace_negotiation=true
--keytab_file=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/krb5kdc/kudu.keytab
--principal=kudu/127.25.254.194
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.25.254.194:0
--rpc_server_allow_ephemeral_ports=true
--rpc_authentication=required
--superuser_acl=<redacted>
--user_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.25.254.194
--webserver_port=0
--webserver_require_spnego=true
--tserver_master_addrs=127.25.254.254:40767
--encrypt_data_at_rest=true
--never_fsync=true
--heap_profile_path=/tmp/kudu.7963
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.25.254.194
--log_dir=/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type DEBUG
built by None at 04 May 2026 13:43:24 UTC on bdcb31816ec0
build id 11740
I20260504 14:11:23.475644  7963 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260504 14:11:23.476548  7963 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260504 14:11:23.483537  7980 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:11:23.483537  7979 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260504 14:11:23.484470  7982 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260504 14:11:23.484555  7963 server_base.cc:1061] running on GCE node
I20260504 14:11:23.485231  7963 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260504 14:11:23.485984  7963 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260504 14:11:23.487185  7963 hybrid_clock.cc:648] HybridClock initialized: now 1777903883487178 us; error 63 us; skew 500 ppm
May 04 14:11:23 dist-test-slave-2x32 krb5kdc[6956](info): AS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903883, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for krbtgt/KRBTEST.COM@KRBTEST.COM
I20260504 14:11:23.490764  7963 init.cc:377] Logged in from keytab as kudu/127.25.254.194@KRBTEST.COM (short username kudu)
I20260504 14:11:23.492163  7963 webserver.cc:492] Webserver started at http://127.25.254.194:34473/ using document root <none> and password file <none>
I20260504 14:11:23.492779  7963 fs_manager.cc:362] Metadata directory not provided
I20260504 14:11:23.492868  7963 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260504 14:11:23.493062  7963 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
May 04 14:11:23 dist-test-slave-2x32 krb5kdc[6956](info): TGS_REQ (2 etypes {17 16}) 127.0.0.1: ISSUE: authtime 1777903883, etypes {rep=17 tkt=17 ses=17}, kudu/127.25.254.194@KRBTEST.COM for HTTP/127.25.254.212@KRBTEST.COM
14:11:23.496 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1
14:11:23.497 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:23.497 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@55d5606e)
14:11:23.501 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/key/kuduclusterkey/_eek?eek_op=generate&num_keys=1] user [kudu] authenticated
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.502 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:529)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:23.503 [http-nio-51381-exec-6] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(1a2f8e9f_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:23.503 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-6:RangerResourceTrie.traverse(resource=kuduclusterkey):16062:17026
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@64874ba4
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-6:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):172388:171916
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026) : true
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-6:RangerDefaultPolicyResourceMatcher.getMatchType():162787:162338
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[generateeek]
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): null
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.504 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): false
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-6:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):118531:118245
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): false
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-6:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):75180:75387
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): true
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@2b7fd434): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-6:RangerPolicyEvaluator.evaluate(requestHashCode=1a2f8e9f,policyId=3, policyName=all):805294:804970
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{generateeek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-6:RangerPolicyEngine.evaluatePolicies(requestHashCode=1a2f8e9f_0):1881183:1879613
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-6:RangerResourceTrie.traverse(resource=kuduclusterkey):11870:12511
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@1c3d4fbe
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-6:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):135781:135913
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.505 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={generateeek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={generateeek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4;seq_num=8;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4;seq_num=8;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=generateeek;resourcePath=kuduclusterkey;resourceType=keyname;action=generateeek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4;seq_num=9;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-4} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(GENERATE_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.506 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$240/0x00007f3d08620670@2484df5f]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.generateEncryptedKeys(KMS.java:530)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), GENERATE_EEK)
14:11:23.507 [http-nio-51381-exec-6] INFO kms-audit -- OK[op=GENERATE_EEK, key=kuduclusterkey, user=kudu/127.25.254.194@KRBTEST.COM, accessCount=1, interval=0ms] 
14:11:23.507 [http-nio-51381-exec-6] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== generateEncryptedKeys(name=kuduclusterkey, eekOp=generate, numKeys=1)
I20260504 14:11:23.512334  7963 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data/instance:
uuid: "86d0531368604f8d8c1301b3f683803d"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "8e0445e2dfe165d4f458c90d7e12e812"
server_key_iv: "358089e756cb44373ba34fa3a43c3025"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.513200  7963 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance:
uuid: "86d0531368604f8d8c1301b3f683803d"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "8e0445e2dfe165d4f458c90d7e12e812"
server_key_iv: "358089e756cb44373ba34fa3a43c3025"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.517675  7963 fs_manager.cc:696] Time spent creating directory manager: real 0.004s	user 0.006s	sys 0.000s
14:11:23.520 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Got token null from httpRequest http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt
14:11:23.520 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] triggering authentication. handler: class org.apache.hadoop.security.token.delegation.web.KerberosDelegationTokenAuthenticationHandler
14:11:23.520 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationHandler -- Falling back to class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler (req=org.apache.catalina.connector.RequestFacade@55e61480)
14:11:23.525 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.authentication.server.AuthenticationFilter -- Request [http://127.25.254.212:51381/kms/v1/keyversion/kuduclusterkey@0/_eek?eek_op=decrypt] user [kudu] authenticated
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- ==> handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey)
14:11:23.528 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.UserGroupInformation -- Failed to get groups for user kudu
java.io.IOException: No groups found for user kudu
	at org.apache.hadoop.security.Groups.noGroupsForUser(Groups.java:200)
	at org.apache.hadoop.security.Groups.getGroups(Groups.java:223)
	at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1755)
	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1743)
	at org.apache.ranger.authorization.kms.authorizer.RangerKMSAccessRequest.<init>(RangerKmsAuthorizer.java:367)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.hasAccess(RangerKmsAuthorizer.java:247)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:266)
	at org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer.assertAccess(RangerKmsAuthorizer.java:164)
	at org.apache.hadoop.crypto.key.kms.server.KMS.assertAccess(KMS.java:745)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:664)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.529 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } }, policyType=0)
14:11:23.530 [http-nio-51381-exec-8] INFO org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- RangerPolicyEngineImpl.evaluatePolicies(7fbbafa0_0, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- ==> preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={null} clusterType={null} context={ISREQUESTPREPROCESSED={false} } })
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- ==> RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerSecurityZoneMatcher -- <== RangerSecurityZoneMatcher.getZonesForResourceAndChildren(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- getMatchedZonesForResourceAndChildren(resource=RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): ret=null
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- No context-enrichers!!!
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.service.RangerDefaultRequestProcessor -- <== preProcess(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0)
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- zoneNames:[null]
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.PolicyEngine -- zoneName:[null]
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null)
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateTagPolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.setAuditEnabledFromCache()
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.setAuditEnabledFromCache():false
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-8:RangerResourceTrie.traverse(resource=kuduclusterkey):15731:16236
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@2b4c838b
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@b96b45e]]
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-8:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):191514:190393
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=1
14:11:23.530 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.isApplicable(Mon May 04 14:11:23 UTC 2026) : true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={-1} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } })
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- ==> hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.model.validation.RangerServiceDefHelper -- <== hierarchyHasAllResources(hierarchy=RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }, resourceNames=keyname): true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- ==> isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} })
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== isHierarchyValidForResources(RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }) : true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchingHierarchy(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }): [RangerResourceDef={itemId={1} name={keyname} type={string} level={10} parent={null} mandatory={true} lookupSupported={true} recursiveSupported={false} excludesSupported={false} matcher={org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher} matcherOptions={{wildCard=true, ignoreCase=false}} validationRegEx={null} validationMessage={null} uiHint={null} label={Key Name} description={Key Name} rbKeyLabel={null} rbKeyDescription={null} rbKeyValidationMessage={null} accessTypeRestrictions={[]} isValidLeaf={true} }]
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- ==> RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null})
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.resourcematcher.RangerAbstractResourceMatcher -- isAllValuesRequested(kuduclusterkey): false
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.getMatchType(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher -- <== RangerDefaultResourceMatcher.isMatch(kuduclusterkey, {token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.policyresourcematcher.match -- [PERF]:http-nio-51381-exec-8:RangerDefaultPolicyResourceMatcher.getMatchType():250121:250354
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyresourcematcher.RangerDefaultPolicyResourceMatcher -- <== RangerDefaultPolicyResourceMatcher.getMatchType(RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }{token:USER=kudu, _REQUEST=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, ISREQUESTPREPROCESSED=true, RESOURCE_ZONE_NAMES=null}): SELF
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.matchPolicyCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } })
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.matchCustomConditions(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}, SELF)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- Checking for accessType:[decrypteek]
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): null
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }RangerPolicyItemAccess={type={getkeys} isAllowed={true} }RangerPolicyItemAccess={type={get} isAllowed={true} }} users={keyadmin rangerkms } groups={} roles={} conditions={} delegateAdmin={true} }, kudu, [], null, null): false
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): false
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-8:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):310129:310927
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): false
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null)
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroup(RangerPolicyItem={accessTypes={RangerPolicyItemAccess={type={getmetadata} isAllowed={true} }RangerPolicyItemAccess={type={generateeek} isAllowed={true} }RangerPolicyItemAccess={type={create} isAllowed={true} }RangerPolicyItemAccess={type={decrypteek} isAllowed={true} }} users={kudu } groups={} roles={} conditions={} delegateAdmin={false} }, kudu, [], null, null): true
14:11:23.531 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchUserGroupAndOwner(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): true
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- ==> RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed)
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.matchCustomConditions(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): true
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.policyitem.request -- [PERF]:http-nio-51381-exec-8:RangerPolicyItemEvaluator.isMatch(resource=kuduclusterkey):107072:107241
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator -- <== RangerDefaultPolicyItemEvaluator.isMatch(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): true
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.getMatchingPolicyItem(org.apache.ranger.plugin.policyengine.RangerAccessRequestWrapper@70188ed): org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyItemEvaluator@24abfee
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- ==> RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.updateAccessResult(RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF, true, null, 3)
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluatePolicyItems(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}, SELF)
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.policy.request -- [PERF]:http-nio-51381-exec-8:RangerPolicyEvaluator.evaluate(requestHashCode=7fbbafa0,policyId=3, policyName=all):1109205:1109239
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerDefaultPolicyEvaluator -- <== RangerDefaultPolicyEvaluator.evaluate(policyId=3, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ALL_ACCESS_TYPE_RESULTS={{decrypteek=RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}}} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={false} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- ==> RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.storeAuditEnabledInCache()
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesForOneAccessTypeNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0, zoneName=null): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.zoneAwareAccessEvaluationWithNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType =0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.policyengine.request -- [PERF]:http-nio-51381-exec-8:RangerPolicyEngine.evaluatePolicies(requestHashCode=7fbbafa0_0):2609760:2608158
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, policyType=0): RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- ==> RangerPolicyResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- ==> RangerResourceTrie.traverse(kuduclusterkey, null)
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.resourcetrie.op -- [PERF]:http-nio-51381-exec-8:RangerResourceTrie.traverse(resource=kuduclusterkey):12081:12611
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerResourceTrie -- <== RangerResourceTrie.traverse(kuduclusterkey, null): evaluators=org.apache.ranger.plugin.policyengine.RangerResourceTrie$EvalCollector@7a330538
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.util.RangerResourceEvaluatorsRetriever -- <== RangerResourceEvaluatorsRetriever.getEvaluators({keyname=kuduclusterkey}) : evaluator:[[org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@5a92a871, org.apache.ranger.plugin.policyevaluator.RangerAbstractPolicyEvaluator$RangerDefaultPolicyResourceEvaluator@559c91e1]]
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.perf.resourcetrie.retrieval -- [PERF]:http-nio-51381-exec-8:RangerPolicyRepository.getLikelyMatchEvaluators(resource=kuduclusterkey):132208:132787
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyRepository -- <== RangerPolicyRepository.getLikelyMatchPolicyEvaluators(kuduclusterkey): evaluatorCount=2
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={2} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={2} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}}): ret=false
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(2, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={1} reason={null} additionalInfo={}})
14:11:23.532 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- ==> RangerPolicyEngineImpl.evaluateResourceAuditPolicies(): Evaluating RangerPolicyEvaluator...: RangerAuditPolicyEvaluator={RangerDefaultPolicyEvaluator={RangerAbstractPolicyEvaluator={policy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} }resourceMatcher={RangerDefaultPolicyResourceMatcher={isInitialized=true, matchers={} }} }auditPolicy={RangerPolicy={id={1} guid={null} isEnabled={true} createdBy={null} updatedBy={null} createTime={null} updateTime={null} version={null} service={null} name={null} policyType={3} policyPriority={1} description={null} resourceSignature={null} isAuditEnabled={true} serviceType={null} resources={} additionalResources={} policyLabels={} policyConditions={} policyItems={} denyPolicyItems={} allowExceptions={} denyExceptions={} dataMaskPolicyItems={} rowFilterPolicyItems={} options={} validitySchedules={, zoneName=null, isDenyAllElse={false} }}} matchAnyResource={true}}
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- ==> RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchAccessResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=true
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.matchUserGroupRole(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }): ret=false
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- RangerAuditPolicyItemEvaluator.isMatch(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluatePolicyItems(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyevaluator.RangerAuditPolicyEvaluator -- <== RangerAuditPolicyEvaluator.evaluate(1, RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateResourceAuditPolicies(request=RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={keyname=kuduclusterkey; } }} accessType={decrypteek} user={kudu} userGroups={} userRoles={} accessTime={Mon May 04 14:11:23 UTC 2026} clientIPAddress={127.0.0.1} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={decrypteek} requestData={null} sessionId={null} resourceMatchingScope={SELF} resourceElementMatchingScopes={{}} clusterName={} clusterType={} context={ISANYACCESS={null} token:USER={kudu} ISREQUESTPREPROCESSED={true} RESOURCE_ZONE_NAMES={null} } }, result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={false} isAudited={false} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): ret=false
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl -- <== RangerPolicyEngineImpl.evaluateAuditPolicies(result=RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={null} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- generateNextAuditEventId(): 7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.getAuthzEvents(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}}): AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5;seq_num=10;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null}
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- ==> RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5;seq_num=10;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.logAuthzAudit(AuthzAuditEvent{repositoryType=7;repositoryName=kms;user=kudu;eventTime=Mon May 04 14:11:23 UTC 2026;accessType=decrypteek;resourcePath=kuduclusterkey;resourceType=keyname;action=decrypteek;accessResult=1;agentId=kms;policyId=3;resultReason=null;aclEnforcer=ranger-acl;sessionId=null;clientType=null;clientIP=127.0.0.1;requestData=null;agentHostname=dist-test-slave-2x32;logType=RangerAudit;eventId=7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5;seq_num=11;event_count=1;event_dur_ms=0;tags=[];clusterName=;zoneName=null;policyVersion=1;additionalInfo=null})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.audit.RangerDefaultAuditHandler -- <== RangerDefaultAuditHandler.processResult(RangerAccessResult={isAccessDetermined={true} isAllowed={true} isAuditedDetermined={true} isAudited={true} auditLogId={7840b05c-dfe9-41b0-bf9e-3d4271db3c10-5} policyType={0} policyId={3} zoneName={null} auditPolicyId={3} policyVersion={1} evaluatedPoliciesCount={2} reason={null} additionalInfo={}})
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerkmsAuthorizer.hasAccess(DECRYPT_EEK, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS) , kuduclusterkey): true
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.assertAccess(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.533 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.security.UserGroupInformation -- PrivilegedAction [as: kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS)][action: org.apache.hadoop.crypto.key.kms.server.KMS$$Lambda$241/0x00007f3d086231e8@1748e33b]
java.lang.Exception: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1896)
	at org.apache.hadoop.crypto.key.kms.server.KMS.handleEncryptedKeyOp(KMS.java:666)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:623)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:199)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.crypto.key.kms.server.KMSMDCFilter.doFilter(KMSMDCFilter.java:92)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:650)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:305)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.apache.hadoop.crypto.key.kms.server.KMSAuthenticationFilter.doFilter(KMSAuthenticationFilter.java:143)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:168)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:144)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:168)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:90)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:482)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:130)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:93)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:660)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:346)
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:396)
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:63)
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:937)
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791)
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:52)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1190)
	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:63)
	at java.base/java.lang.Thread.run(Thread.java:840)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.isACLPresent(kuduclusterkey, MANAGEMENT)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- ==> RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.activate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.activate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- ==> RangerPluginClassLoader.deactivate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.plugin.classloader.RangerPluginClassLoader -- <== RangerPluginClassLoader.deactivate()
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.ranger.authorization.kms.authorizer.RangerKmsAuthorizer -- <== RangerKmsAuthorizer.hasAccessToKey(kuduclusterkey, kudu/127.25.254.194@KRBTEST.COM (auth:KERBEROS), DECRYPT_EEK)
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
14:11:23.534 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.util.PerformanceAdvisory -- Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
14:11:23.535 [http-nio-51381-exec-8] INFO kms-audit -- OK[op=DECRYPT_EEK, key=kuduclusterkey, user=kudu/127.25.254.194@KRBTEST.COM, accessCount=1, interval=0ms] 
14:11:23.535 [http-nio-51381-exec-8] DEBUG org.apache.hadoop.crypto.key.kms.server.KMS -- <== handleEncryptedKeyOp(versionName=kuduclusterkey@0, eekOp=decrypt)
I20260504 14:11:23.543985  7990 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.545401  7963 fs_manager.cc:730] Time spent opening block manager: real 0.007s	user 0.002s	sys 0.001s
I20260504 14:11:23.545565  7963 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data,/tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal
uuid: "86d0531368604f8d8c1301b3f683803d"
format_stamp: "Formatted at 2026-05-04 14:11:23 on dist-test-slave-2x32"
server_key: "8e0445e2dfe165d4f458c90d7e12e812"
server_key_iv: "358089e756cb44373ba34fa3a43c3025"
server_key_version: "kuduclusterkey@0"
I20260504 14:11:23.545713  7963 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260504 14:11:23.567090  7963 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260504 14:11:23.570892  7963 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260504 14:11:23.571121  7963 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260504 14:11:23.571702  7963 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260504 14:11:23.572654  7963 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260504 14:11:23.572731  7963 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.572803  7963 ts_tablet_manager.cc:616] Registered 0 tablets
I20260504 14:11:23.572853  7963 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260504 14:11:23.583966  7963 rpc_server.cc:307] RPC server started. Bound to: 127.25.254.194:37821
I20260504 14:11:23.583979  8103 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.25.254.194:37821 every 8 connection(s)
I20260504 14:11:23.585071  7963 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/data/info.pb
I20260504 14:11:23.589604 26619 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskMMfo7I/build/debug/bin/kudu as pid 7963
I20260504 14:11:23.589804 26619 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskMMfo7I/test-tmp/security-itest.0.SecurityITest.TestEncryptionWithKMSIntegrationMultipleServers.1777903638260922-26619-0/minicluster-data/ts-1/wal/instance
I20260504 14:11:23.590082 26619 external_mini_cluster.cc:1468] Setting key a42e6fc8f5cb4ffede72e3275438c238
W20260504 14:11:23.612336  6954 connection.cc:570] client connection to 127.25.254.254:40767 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260504 14:11:23.613173  8106 negotiation.cc:336] Failed RPC negotiation. Trace:
0504 14:11:23.587949 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:54845)
0504 14:11:23.588453 (+   504us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:23.588499 (+    46us) client_negotiation.cc:175] Beginning negotiation
0504 14:11:23.589458 (+   959us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0504 14:11:23.612378 (+ 22920us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107)
Metrics: {"client-negotiator.queue_time_us":284,"thread_start_us":102,"threads_started":1}
W20260504 14:11:23.613706  8104 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:40767 (0 consecutive failures): Network error: Failed to ping master at 127.25.254.254:40767: Client connection negotiation failed: client connection to 127.25.254.254:40767: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260504 14:11:23.614637  8106 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:23.614318 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:53939)
0504 14:11:23.614494 (+   176us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:23.614557 (+    63us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":74}
I20260504 14:11:23.615413  8106 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:23.615173 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:42685)
0504 14:11:23.615304 (+   131us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:23.615347 (+    43us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":60}
W20260504 14:11:23.615587  8104 heartbeater.cc:412] Failed 3 heartbeats in a row: no longer allowing fast heartbeat attempts.
I20260504 14:11:24.324190  7954 heartbeater.cc:499] Master 127.25.254.254:40767 was elected leader, sending a full tablet report...
I20260504 14:11:24.325222  8108 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:24.324743 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:40745)
0504 14:11:24.325059 (+   316us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:24.325128 (+    69us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":227,"thread_start_us":101,"threads_started":1}
W20260504 14:11:24.325599  7954 heartbeater.cc:646] Failed to heartbeat to 127.25.254.254:40767 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
I20260504 14:11:24.616773  8109 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:24.616270 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:35735)
0504 14:11:24.616580 (+   310us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:24.616638 (+    58us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":216,"thread_start_us":84,"threads_started":1}
I20260504 14:11:25.326953  8110 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:25.326440 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:54285)
0504 14:11:25.326778 (+   338us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:25.326830 (+    52us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":244,"thread_start_us":137,"threads_started":1}
I20260504 14:11:25.618445  8111 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:25.617807 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:36739)
0504 14:11:25.618284 (+   477us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:25.618341 (+    57us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":308,"thread_start_us":169,"threads_started":1}
I20260504 14:11:26.328428  8112 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:26.327943 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:53869)
0504 14:11:26.328268 (+   325us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:26.328329 (+    61us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":235,"thread_start_us":83,"threads_started":1}
I20260504 14:11:26.619830  8113 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:26.619348 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:50135)
0504 14:11:26.619672 (+   324us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:26.619730 (+    58us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":237,"thread_start_us":105,"threads_started":1}
I20260504 14:11:27.329867  8114 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:27.329414 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:54765)
0504 14:11:27.329700 (+   286us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:27.329768 (+    68us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":201,"thread_start_us":119,"threads_started":1}
I20260504 14:11:27.621392  8115 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:27.620911 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:51441)
0504 14:11:27.621235 (+   324us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:27.621291 (+    56us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":238,"thread_start_us":89,"threads_started":1}
I20260504 14:11:28.331454  8116 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:28.330993 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.193:43131)
0504 14:11:28.331297 (+   304us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:28.331356 (+    59us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":219,"thread_start_us":86,"threads_started":1}
I20260504 14:11:28.623057  8117 negotiation.cc:338] RPC negotiation tracing enabled. Trace:
0504 14:11:28.622573 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.25.254.254:40767 (local address 127.25.254.194:33869)
0504 14:11:28.622875 (+   302us) negotiation.cc:107] Waiting for socket to connect
0504 14:11:28.622935 (+    60us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.25.254.254:40767: connect: Connection refused (error 111)
Metrics: {"client-negotiator.queue_time_us":212,"thread_start_us":85,"threads_started":1}
F20260504 14:11:28.661520 26619 external_mini_cluster.cc:650] Check failed: leader_master() != nullptr Must have started at least 1 master before adding tablet servers
*** Check failure stack trace: ***
*** Aborted at 1777903888 (unix time) try "date -d @1777903888" if you are using GNU date ***
PC: @                0x0 (unknown)
*** SIGABRT (@0x3e8000067fb) received by PID 26619 (TID 0x7f508edbfd80) from PID 26619; stack trace: ***
    @     0x7f509aa5d980 (unknown) at ??:0
    @     0x7f50980afe87 gsignal at ??:0
    @     0x7f50980b17f1 abort at ??:0
    @     0x7f509931cdcd google::LogMessage::Fail() at ??:0
    @     0x7f5099320b93 google::LogMessage::SendToLog() at ??:0
    @     0x7f509931c7cc google::LogMessage::Flush() at ??:0
    @     0x7f509931df59 google::LogMessageFatal::~LogMessageFatal() at ??:0
    @     0x7f509c0a043b kudu::cluster::ExternalMiniCluster::AddTabletServer() at ??:0
    @     0x7f509c09d156 kudu::cluster::ExternalMiniCluster::Start() at ??:0
    @     0x5596e7d7a7bf kudu::SecurityITest::StartCluster() at ??:0
    @     0x5596e7d62aa5 kudu::SecurityITest_TestEncryptionWithKMSIntegrationMultipleServers_Test::TestBody() at ??:0
    @     0x7f509a3280ed testing::internal::HandleExceptionsInMethodIfSupported<>() at ??:0
    @     0x7f509a31cbdb testing::Test::Run() at ??:0
    @     0x7f509a31cd9d testing::TestInfo::Run() at ??:0
    @     0x7f509a31d377 testing::TestSuite::Run() at ??:0
    @     0x7f509a31d77c testing::internal::UnitTestImpl::RunAllTests() at ??:0
    @     0x7f509a32860d testing::internal::HandleExceptionsInMethodIfSupported<>() at ??:0
    @     0x7f509a31ce63 testing::UnitTest::Run() at ??:0
    @     0x5596e7db2b0e RUN_ALL_TESTS() at ??:0
    @     0x5596e7db1cfb main at ??:0
    @     0x7f5098092c87 __libc_start_main at ??:0
    @     0x5596e7d49efa _start at ??:0
2026-05-04 14:11:28.851 UTC [7667] FATAL:  terminating connection due to unexpected postmaster exit
2026-05-04 14:11:28.853 UTC [7497] FATAL:  terminating connection due to unexpected postmaster exit
2026-05-04 14:11:28.853 UTC [7504] FATAL:  terminating connection due to unexpected postmaster exit
2026-05-04 14:11:28.853 UTC [7501] FATAL:  terminating connection due to unexpected postmaster exit
2026-05-04 14:11:28.854 UTC [7503] FATAL:  terminating connection due to unexpected postmaster exit
2026-05-04 14:11:28.855 UTC [7502] FATAL:  terminating connection due to unexpected postmaster exit